
Conversations with Leonid Pekelis, Stanford PhD, Ex: Optimizely, Opendoor

By MirData.Report • Issue #1
In this issue we chat with Leonid Pekelis, currently Head of Data Science at CloudTrucks and previously the key Data Science person behind several Data-driven companies, most notably Optimizely and OpenDoor.
I was introduced to Leonid back during our days at Earnest (the FinTech student loan lender). Leonid impressed me with his intelligence even back then, and since then he has only taken on more impressive projects. Enjoy!

Data Science at Optimizely: Trust is the most important part of Data Projects
Leonid: Actually, loss of trust is probably the biggest risk💀 with data products that I’ve seen. You know, if you’re not careful, it’s pretty easy to show kind of incongruous results💡. And as soon as your users lose trust in something that, as you mentioned, maybe they don’t fully understand to begin with, it’s hard to win that trust back⮐. Right? It’s kind of like, you know, if you’re selling a car, and people believe that something in the internals of the combustion engine is flaky or not to be trusted, right? Well, no one’s gonna buy the car🚗.
A similar thing was starting to happen with Optimizely. So what we did is we changed the statistics to kind of match the way users actually use the system. Because as it turns out, the classical stuff that you mentioned, the stuff you find when you search on Google, only really works if you do the standard procedure, which is: you start the experiment🧪 and you divide your groups into a 50/50 split. And then you don’t look at the results for however long, you know, 10,000 users, 100,000 users, two weeks, a month, and only at the very end do you look at the results and see if one’s better than the other. It only works in that situation.
And the problem is that Optimizely, which makes total sense because it’s a completely online business💻, was all built around dashboards, right? So you can sit there every hour, every day, and watch the significance number trending over time. And that’s the exact wrong way to do it, because there’s some variability there. You’ll see these kind of sporadic spikes, and if you catch those, well, then you get false positives.
So what we did is we changed the math to account for that😎. It’s called sequential testing. And it was kind of neat, because this is probably the biggest example I have where we did some pretty deep theoretical work, we ended up reading papers, but it also ended up being something that was pretty useful in a product sense, because this Stats Engine that we built is still a flagship product for Optimizely 🙆
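To make the peeking problem concrete, here is a minimal A/A simulation sketch. This is not Optimizely’s Stats Engine; the conversion rate, batch size, and sample sizes are purely illustrative. It only shows that re-checking a classical fixed-horizon test after every batch of visitors pushes the false-positive rate well above the nominal 5%, which is the behavior sequential testing is designed to correct.

```python
# Illustrative A/A simulation of the "peeking" problem (not Optimizely's Stats Engine).
# Both groups have the same true conversion rate, so any "significant" result is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def peeking_false_positive_rate(n_experiments=2000, n_visitors=10_000,
                                batch=500, alpha=0.05, p=0.10):
    """Fraction of A/A experiments that ever look 'significant' at some peek."""
    z_crit = stats.norm.ppf(1 - alpha / 2)
    false_positives = 0
    for _ in range(n_experiments):
        a = rng.random(n_visitors) < p   # control conversions
        b = rng.random(n_visitors) < p   # variant conversions (same true rate)
        for n in range(batch, n_visitors + 1, batch):   # peek after every batch
            ca, cb = a[:n].sum(), b[:n].sum()
            pooled = (ca + cb) / (2 * n)
            se = np.sqrt(2 * pooled * (1 - pooled) / n)
            if se == 0:
                continue
            if abs((cb / n - ca / n) / se) > z_crit:     # looks "significant" right now
                false_positives += 1
                break
    return false_positives / n_experiments

print(peeking_false_positive_rate())  # typically several times the nominal 0.05
```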
How Optimizely fixed Trust with their Data customers (Leonid Pekelis)
Inventory Management, Optimal Pricing and Price Forecasting in Real Estate
Data Science at OpenDoor: How to flip homes most optimally and make a lot of money using data?
Leonid: You can think about the home 🏠 price, right?
Like the listed price of a home, you can imagine that if I list a home for a higher price, or a lower price, that would impact how long it takes to sell. As I raise the price 💲, it takes longer 🕓 to sell.
So that’s clear, that makes sense. And by the way, that’s something I can observe, right? I can look it up on Zillow or the MLS or what have you, and see the listed price for a home. But now the problem is, when I go to use this model for forecasting 📶, I don’t just need the price the home is listed at today. I also want a forecast one week into the future, or two weeks into the future.
So now I need to know what the price for the home is going to be two weeks from now ⏭🕔. And, you know, the hard thing about this is that the basic things you would think of doing don’t work. Such as: well, maybe it’s just going to be the same price.
So keep the same price that the home has now two weeks from now. Well, that’s wrong, because typically you lower the price of the home over time if it doesn’t sell, because not selling gives you more information that maybe the home isn’t quite as in demand.
The other thing you might want to do is just stick in, say, the average price of all homes two weeks in 🕣🕛, but that’s also not good because, as I’m showing in this example here on the left, maybe the average home in your training data set is a nice country home with, you know, four or five bedrooms 🏘.
And maybe the average home in your test data is a small cottage, right? So just blindly using the average home, that doesn’t make sense either, because you’re telling the model that this small cottage (in this example it’s flipped 🤸‍♂) is now worth the same price as a large country home. Right? And that messes up your results as well 🧟‍♂
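As a small hedged sketch of the two naive imputations described above: the column names, prices, and “training average” below are purely illustrative, not OpenDoor’s schema or data.

```python
# Illustrative only: to score a "will it sell?" model two weeks out, we need a
# list price two weeks out that we haven't observed yet.
import pandas as pd

listings = pd.DataFrame({
    "home_id":       [1, 2],
    "current_price": [450_000, 180_000],   # a large country home vs. a small cottage
})

# Naive option 1: carry today's price forward two weeks.
# Wrong in practice, because sellers typically cut the price if the home hasn't sold.
listings["price_2w_carry_forward"] = listings["current_price"]

# Naive option 2: plug in the average two-weeks-in price from the training set.
# Also wrong: it tells the model the cottage and the country home are suddenly
# worth the same amount, which contradicts every other feature it sees.
AVG_TRAINING_PRICE_2W = 320_000   # hypothetical training-set average
listings["price_2w_training_avg"] = AVG_TRAINING_PRICE_2W

print(listings)
```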
How to flip homes most optimally and make a lot of money using data?
Inventory Management, Price Optimization, Forecasting Real Estate
Data Science at OpenDoor: Clearance Rate Models in Real Estate Transactions and Implications of Holding Costs on the Profit
Leonid: So, you know, what are some of the things that you might want to use a clearance rate model for? I think one of the core concepts that really came out of this type of modeling work is that when you try to clear any kind of inventory, you don’t just want to look at the price🏷️, you know, the price relative to your valuation of the goods.
So, you know, if you sell it for more than the value, that’s fine👌. But that’s not the whole picture. Really, what you want to do is match that against how long it takes to sell, especially for houses, which have quite high holding costs💰, right? And so, you know, if you sell the house for $100, or $1,000, more than you value it, that’s okay. But if you ended up taking a month longer to do it, well, you’re actually worse off😟, because you’re paying holding costs that whole period of time. And so what a model like this allows you to do is say, you know, it takes a few hundred dollars of holding costs per day.
Well, now I can perform a calculation that says: here’s the price I sold it for, minus the valuation of the home, minus holding costs per day times the number of days that it was on the market, and that gives you an overall profit🤑.
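As a worked version of that calculation (the numbers are illustrative, not OpenDoor figures):

```python
# Profit = (sale price - valuation) - holding cost per day * days on market.
def resale_profit(sale_price, valuation, holding_cost_per_day, days_on_market):
    return (sale_price - valuation) - holding_cost_per_day * days_on_market

# Selling $1,000 above the valuation but taking 30 extra days at $200/day of
# holding costs leaves you $5,000 worse off than selling at the valuation immediately.
print(resale_profit(301_000, 300_000, holding_cost_per_day=200, days_on_market=30))  # -5000
print(resale_profit(300_000, 300_000, holding_cost_per_day=200, days_on_market=0))   # 0
```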
And so you can start using something like this to do optimal pricing, right? As I raise or lower the list price, all of those components change, and maybe there’s a sweet spot🎯 somewhere where I optimize profit. Right? And the thing is, out there in the market you have two sides of the transaction. So, you know, just as OpenDoor is trying to figure out the fair price for a home, so is the counterparty, right, and each of us has different amounts of information. And so you can think about it🤔.
If you’re trying to price the house correctly, well, if your model is wrong, then kind of two things can happen, right? If the model is wrong in one direction and you’re mispricing a house by saying it’ll sell really, really quickly, well, then you’re going to raise the price⏫💲 on the house, right? You’re going to overprice it, you’ll actually overshoot, and the house will end up staying on the market for a lot longer.
Actually, in real estate, that’s a really bad thing to do, because people typically don’t behave rationally. As soon as the house is on the market for longer than other houses, people think there’s something wrong with it, right? And so that kind of carries negative externalities with it. And then you can also be wrong in the other direction, where you think a house is going to take a lot longer to sell than it actually does. And so you lower the list price⏬💲, and you’re leaving money on the table there. 😱
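Here is a hedged sketch of how a clearance rate model could plug into that optimal pricing loop: sweep candidate list prices, and for each one combine the model’s expected days on market with holding costs to find the profit sweet spot. The clearance model below is a made-up placeholder with an arbitrary shape, and the valuation and cost numbers are illustrative; none of it is OpenDoor’s actual model.

```python
# Illustrative optimal-pricing sweep over a placeholder clearance rate model.
import numpy as np

VALUATION = 300_000          # hypothetical valuation of the home
HOLDING_COST_PER_DAY = 200   # hypothetical holding cost

def expected_days_on_market(list_price):
    """Placeholder clearance model: listing above the valuation slows the sale."""
    premium = (list_price - VALUATION) / VALUATION
    return 15 * np.exp(25 * premium)   # arbitrary illustrative shape

def expected_profit(list_price):
    days = expected_days_on_market(list_price)
    return (list_price - VALUATION) - HOLDING_COST_PER_DAY * days

candidate_prices = np.arange(280_000, 330_001, 1_000)
profits = np.array([expected_profit(p) for p in candidate_prices])
best = int(candidate_prices[profits.argmax()])

# With this placeholder model the sweet spot is an interior point: pricing too
# high piles up holding costs, pricing too low leaves money on the table.
print(f"best list price ~ {best:,}, expected profit ~ {profits.max():,.0f}")
```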
Clearance Rate Models in Real Estate Transactions and Implications of Holding Costs on the Profit