Get ready to embark on a CRO journey with our special guest, Andres Glusman, the CEO at Do What Works. With his knack for behavioral science and strategic experimentation, Andres spills the beans on how to skyrocket your SaaS conversion rates.
Get ready for a deep dive into CRO experimentation programs and insights that will supercharge your SaaS business.
📈 The Power of Strategy in Driving Conversions
According to Andres, you don’t need a team of data scientists or a PhD in statistics to ace experimentation. It’s all about having a systematic approach to testing and learning. Forget the complexity; it’s all about the approach! Andres and his team of 25 prove that you can conquer the CRO world without a single statistics whiz. It’s time to channel your inner testing superhero.
🔎 Statistical Significance Unleashed: A/B Testing Like a Boss
Let’s talk about statistical significance. In this episode, Andres spills the tea on how to master A/B testing for SaaS success. He unveils the magic of your golden pages—the crown jewels of your B2B SaaS website. By optimizing these, you’ll witness a conversion rate boost like no other. But that’s not all. Andres reveals the secrets behind impactful user experience optimizations and the core messaging that truly resonates with your audience.
✌️ From Struggles to Success: Recommendations for Low Conversions
Are your conversion rates in the dumps? Fear not! Andres has got your back. He dishes out actionable recommendations to turn those low conversions into glorious victories. Media analysis, qualitative research, and direct customer conversations are the keys to refining your messaging and gathering powerful insights. And here’s the secret sauce—continuous iteration and experimentation. By designing experiments with clear hypotheses, tracking results, and extracting meaningful insights, you’ll unlock the secret to conversion rate domination.
😲 Remarkable Experiments that Make Your Jaw Drop
Andres also shares some great experiments that have left the SaaS world in awe. These experiments showcase the power of strategic experimentation and its ability to propel businesses to new heights. With the wisdom of industry thought leaders and a data-driven mindset, you too can achieve geometric growth and transform your SaaS business into a powerhouse.
The SaaS universe is a wild and ever-changing landscape, but with the insights and expertise of Andres Glusman, you hold the key to unlocking your SaaS superpowers. Conversion rate optimization is your ticket to growth, and Andres is here to guide you every step of the way.
Ah, and we also have a little gift
Grab our free Experimentation Sheet: bit.ly/43GoSkE
Don’t miss out on the latest stories from top SaaS experts – delivered straight to your inbox.
Quick bio
Name: Andres Glusman
What he does: CEO @ https://dowhatworks.io/
Andres on the Web: https://www.linkedin.com/in/glusman
Todd (00:01.13)
Andres, welcome to the show.
Andres Glusman (00:03.081)
to be here.
Todd (00:04.35)
Yeah, okay, well, maybe you can give us an introduction. Who is Andres Glusman? Yeah.
Andres Glusman (00:10.488)
Who is? I am the co-founder and CEO of Do What Works. I am a behavioral scientist by training. I'm someone who's been involved online and with online experiments since the very early days of the commercial internet. So I'm basically a behavioral science nerd who loves applying the craft to all things business.
Todd (00:33.046)
Yeah, gotcha. And you also, just for context, you worked in the SaaS space for some time. You had quite a senior role at Meetup in the early days and you worked there for some considerable time, right?
Andres Glusman (00:42.268)
That's right. So I helped launch Meetup. I made their first $14 of revenue, had almost every single role you can have over the course of a decade and a half, eventually leading product and growth. I led strategy, I led community, broad swaths of the organization. It was really a wild, really interesting ride, en route from, you know, just getting it off the ground to a full-scale acquisition where we had 40 million users at the end,
Andres Glusman (01:04.852)
and successful exit that gave me the ability to go off and start my new thing a few years ago. So it's not so new anymore, but it certainly gave me a lot of freedom.
Todd (01:13.086)
Nice. So before we get into the new venture, which is Do What Works — and correct me if I'm wrong, the title at Meetup was chief strategy officer? Yeah. I always find that people have a different definition for strategy, or a different understanding of strategy. So I'm actually going to put you on the spot. What is your definition, or how would you explain what strategy is, or kind of what you did there?
Andres Glusman (01:19.568)
Yeah.
Andres Glusman (01:32.54)
Sure, I love the definition from Michael Porter. And the definition that he has is that strategy is what you don't do. And that to me is the most important definition of it, because there's focus. It's what you choose not to do. With an organization at Meetup's size, as chief strategy officer, what you end up having is many, many, many teams that are all operating, that all have lots of things they could accomplish. And the challenge with strategy, when you're chief strategy officer at that size of organization, is really around defining
Todd (01:43.798)
Focus.
Andres Glusman (02:02.756)
where you're going and maximizing alignment by basically having the right set of goals in place, getting the right frameworks, getting people generally rowing in the right direction and still giving a lot of leeway and latitude for everybody to operate within their own swim lanes to be able to move the ball forward. So it's basically a lot of orchestrating and a lot of moving pieces.
Todd (02:21.814)
Gotcha, thank you. Cool, so the current venture, Do What Works, by the way, love the name, super descriptive. Yeah, maybe you can explain what you've been doing the last few years. Yeah, explain what is Do What Works.
Andres Glusman (02:26.969)
Thank you.
Andres Glusman (02:33.66)
Yeah, Do What Works is a product that helps growth leaders do what works. We've built an engine with patented technology that allows us to detect the experiments, the growth tests, that are being run by any company. And so our clients, who include six of the world's top streaming brands, eight of the world's top SaaS companies, major banks, learning platforms, et cetera, et cetera —
Andres Glusman (02:59.812)
they're all using Do What Works to leverage the experiments that are being run by others, and use that data to generate better headlines using AI, for example, on their search copy, or to optimize their landing pages and key experiences on their website, so that they can get more results for all the time and money and energy they spend getting people to their website.
Andres Glusman (03:24.716)
And small improvements in the conversion rates of your website can have a pretty profound impact on your ultimate volumes, on your cost of acquisition, your ability to outspend your competitors. Generally, really, really, really good things happen when you have that advantage, when you can get more results from the money you're spending. And so that's what we do. What we help our clients really do is to go from kind of wandering around in a fog, where they're generally learning every lesson on their own, to learning from everyone else, so that they sort of start with an advantage. They start —
Andres Glusman (03:54.2)
to use an American metaphor, a baseball analogy — they start on second base.
Todd (03:57.634)
Yeah. Yeah. And I guess, I mean, a lot of companies that I've seen don't really have very robust experimentation programs, but the idea here is that if you are running experimentation programs, the time to actually come up with the ideation, to design the experiments, to actually build them, to execute, to wait for the results — all that stuff takes a lot of time, a lot of money. And what you're kind of saying here is that
Todd (04:22.538)
a lot of people actually kind of start off making the same mistakes. So instead of starting off on first base where, yeah, you might have a nice idea, you can start off from a better position because you already know the types of experiments that have won with different companies. So why don't you start here and then that gives you kind of huge leverage. That's kind of the principle, right?
Andres Glusman (04:38.076)
That's right. And it's shocking, and it's unfortunate, that there are two things that people in our industry — the conversion optimization business, people who are really focused on growth — don't talk a lot about. The two things I'll talk about are exactly what you said. One, it takes a surprisingly long time to run an experiment. And the launching part is not the hard part. That's gotten easy, thanks to Optimizely, thanks to Adobe, thanks to various other engines that make it easy to get experiments out the door. It's easier than ever. At least it's a lot easier than when I started running tests 20 years ago.
Andres Glusman (05:09.496)
The problem now is that once you get a test out the door — which is not trivial, just to be clear — you now need to wait about a month to get enough traffic to that page, to that experience, in order to get results. And so it's kind of shocking to people when they start to put the numbers together and say, wait, if I have to wait a month and I can't run tests in parallel, that means I get 12 shots a year to run and to improve something.
Andres Glusman (05:38.284)
I get 12 tries, which is a surprisingly small number of tries in a year. And then the worst part, kind of the bad news on bad news, is that according to Optimizely, 80% of experiments that people run on their platform do not positively move the needle. So you have this challenge, which is to say: you know there's good things that happen when you can get these wins, but you have a very small number of tries you get a year, and 80% of them are not going to move the needle. You're going to waste 80% of your time learning.
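The odds Andres quotes are stark once you multiply them out. A quick sketch of the arithmetic (the 20% win rate is Optimizely's figure as he cites it):

```python
# 12 tests a year at an ~80% miss rate leaves very few actual wins.
tests_per_year = 12
win_rate = 0.20  # per Andres, citing Optimizely
print(f"Expected wins per year: {tests_per_year * win_rate:.1f}")  # ~2.4
```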
Todd (06:09.228)
Yeah.
Andres Glusman (06:10.236)
And that's a really, really rough game. Would you go to a casino that offered you those odds, Todd? Would you sit down at a blackjack table where you're like, you're gonna lose eight out of, you know, eight out of 10 hands?
Todd (06:22.478)
Depends how many beers I'd had, maybe if I'd had a few, no, I probably wouldn't take that, I wouldn't take that bet. Ha ha ha.
Andres Glusman (06:25.252)
That's right. Well, one, it depends on the beers, and two, it depends on how much you win when you do win. And so that's the whole goal — to make it so that when you do win, those wins more than pay for themselves and they make a big, big difference. But our point of view, like you were saying, is: well, everyone has that terrible hit rate because everyone recreates every losing experiment on their own. And it's sort of this thing — you know, experience is what you get when you don't get what you want.
Todd (06:38.06)
Yeah.
Andres Glusman (06:54.316)
And I think learning is what you get when you don't get what you want, when you don't get the win. And so why not figure out ways of learning from other people to jump start your process and avoid wasting a month or two months or three of those months on experiments that are just doomed from the start.
Todd (07:11.606)
Yeah, got it. And I have a bunch of follow up questions. But before I do that, I just want to kind of just jump into kind of the nuts and bolts of experimentation because in my experience, and I think maybe my experience is going to be very different to yours. So we're a small agency, but we work with SaaS companies and many of our clients are in the small to midsize range. So as an agency, you know, you see a lot of different businesses, not necessarily going to work with all of them, but you see under the hood of a lot of different companies. And in my experience, in that
Andres Glusman (07:14.853)
Yeah.
Andres Glusman (07:19.589)
Hmm.
Todd (07:39.606)
side of the market — I'm sure it's very different when you move to enterprise — there often isn't, surprisingly, a very robust experimentation program. So I just want to maybe speak to the people listening to this that maybe, you know, they're either a CMO, they're working in marketing, they know they need to do more experimentation. But I think there's a bit of a misconception in marketing — you always hear that phrase, you know, "we should A/B test that."
Todd (08:03.318)
But actually, realistically, of course you can experiment, but to get to statistical significance, to have real outcomes, there have to be some kind of parameters in place. Maybe we can just speak on that loosely — traffic volume, conversion volume — and we don't wanna get really deep into statistics here, but what are the parameters you really need to run statistically significant A/B tests?
Andres Glusman (08:26.692)
Yeah, it is so fun to run tests, because you do get to learn. And if you had the ability to test everything, man, would that be cool. You would be so smart about kind of the most minute details of what works and doesn't work. And if you had an infinite amount of traffic, you could do that. When you don't have an infinite amount of traffic, the key
Andres Glusman (08:53.136)
thing you really need to do is focus on being extremely pragmatic. And what that means is being ruthless around saying: where are the few places where, if I run experiments, it'll make the biggest difference? So the number one thing to start off doing is to understand where your golden pages are. And I wish I could take credit for this — I didn't steal it, I borrowed it from the head of growth at Asana, and I promised him I'd give him credit for it. But
Andres Glusman (09:20.132)
where are your golden pages? On a B2B SaaS, a golden page is your homepage, your pricing page, your signup page, your product page, your SEM landing page — the pages you're driving all the traffic to. There's not that many of them, and improvements there have a disproportionate impact on your bottom line. So number one, start by focusing on your golden pages. Two, what are the kinds of changes that, if they had a material impact, I would feel on my business?
Andres Glusman (09:47.352)
And so the smaller your business, the bigger the conversion rate lift needs to be in order to have a meaningful impact. If you're a company with billions of dollars of revenue flowing through your pipes — because you're Microsoft, you're Adobe, you're a huge enterprise company — great: small improvements there will represent millions or billions of dollars. As you get smaller and smaller, you need the lift to be that much greater, and therefore you need to be prepared
Andres Glusman (10:14.864)
to focus on the things that are more fundamental to driving results. What I like to think of there is meat versus potatoes. So the meat — well, I'm Argentinian, the meat is what you came for. It's the core of the dish. There's the meat on your website: the user experience elements that you optimize, that you change, that are really related to getting the job done — understanding the value, understanding the discount, understanding
Andres Glusman (10:43.792)
the framework and what the product can do for you. Those are the things that make the biggest difference. Everything that surrounds it is the potatoes, is the garnish — it makes the plate look pretty, but it generally doesn't make that big a difference. And what you wanna focus in on is: what's the meat and what's the potatoes on any given page, if that makes sense. Now, what we're doing is obviously looking at every experiment that's run on any given page,
Andres Glusman (11:09.592)
and then dissecting all the different variants, or the ways in which we see people testing things. But if you sort of abstract outwards, the general principle is: if you focus from the action you want people to take outwards, and what leads to that action, you're gonna find a pretty good path — a golden path on a golden page.
Todd (11:29.11)
Gotcha. Okay, super useful, but just to be kind of specific on the statistical side: realistically, how many conversions do we need, you know, per month? By the way, there's never one-size-fits-all when it comes to statistics — there are different ways you can chop this stuff up. But you'll hear different numbers: you need 700 conversions, you need 1,000 conversions in a month. Do you have a rough benchmark you use if you're recommending A/B testing on, like, a marketing landing page, for example?
Andres Glusman (11:44.144)
Mm-hmm.
Andres Glusman (11:59.544)
I usually go to one of the calculators that's available. There's a lot of different tools available to use. The thing I think about, more so than the number, the amount of traffic — and you certainly need a certain amount of traffic, working backwards from the sale, and there are certain calculators you can put that into — the most important thing to think about when you're determining the volume of traffic you need is your margin of error. Any experiment that's run has a margin of error. If you hear people talk about a 95% confidence interval,
Todd (12:02.774)
Mm-hmm.
Andres Glusman (12:28.58)
it's because there's a two-and-a-half percent margin of error either way, plus or minus. That's great if you're looking to put a human being on the moon — or actually, not even great; you might need 99.999% if you want to put a human on the moon, or if you want to put a pill into somebody's body. You want some really high statistical significance, because you don't want to be wrong. But the smaller the error rate you live with — if you're going for 95% confidence —
Andres Glusman (12:57.016)
that's going to take you a long, long time. If you're willing to drop from a 95 to a 90 or an 80, it means you can get directional confidence. You won't feel as certain, but what you can do in the process is hack off a lot of the time you spend. And so if you're willing to live with a little bit more error in your measurement and your experimentation, and not have statistical significance be at the 0.95 or 0.99, but really be a little bit lower,
Andres Glusman (13:26.136)
you're going to get learning faster. And why would that be attractive to you? Because now, instead of having 12 shots a year, maybe you can get that up to 18 or 24. And that's a pretty big difference. So are you willing to double your volume and live with a slightly larger error rate, knowing that you can make up for it over time, because you can run more experiments and hone in on the truth faster by getting more trials? And so the number one thing I'd suggest is actually:
Andres Glusman (13:55.02)
Use the calculators, but don't necessarily be fixated on a 95% confidence interval. Give yourself some permission to go down to 90, go down to 85, go down to 80. See where it takes you.
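To put rough numbers on the trade-off Andres describes, here's a minimal sketch of the standard two-proportion sample-size formula — not the method of any particular calculator he mentions, and the baseline rate and detectable lift below are illustrative assumptions:

```python
# Sketch: how relaxing the confidence level shrinks the traffic an A/B test
# needs. Standard two-sided two-proportion z-test formula; all numbers are
# illustrative, not from any specific calculator.
from math import sqrt
from statistics import NormalDist  # stdlib, Python 3.8+

def visitors_per_variant(baseline, relative_lift, confidence=0.95, power=0.80):
    """Visitors needed per variant to detect `relative_lift` over `baseline`."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided
    z_beta = NormalDist().inv_cdf(power)
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return n / (p2 - p1) ** 2

# A 3% baseline conversion rate, hoping to detect a 10% relative lift:
for conf in (0.95, 0.90, 0.80):
    n = visitors_per_variant(0.03, 0.10, conf)
    print(f"{conf:.0%} confidence: {n:,.0f} visitors per variant")
```

In this scenario, dropping from 95% to 80% confidence cuts the required traffic by roughly 40% — exactly the "more shots per year" trade Andres is pointing at.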
Todd (14:06.347)
Yeah.
Todd (14:06.914)
Got it. And by the way, thank you for highlighting the calculators, because I've tried to make something similar and it's actually relatively complex — it's not like you can just say you need 700 conversions a month. There's minimum detectable effect, there's statistical power; we need to look at all these things in different ways. So yeah, definitely go look at the calculators. But in my experience — and the reason I make that point is because, in a lot of cases, people actually don't have the volume that they need to run A/B tests. Even if you lower the statistical significance
Todd (14:36.948)
to 80%, it's still actually quite challenging for most people. So you kind of touched on it a little bit there, but what would you recommend for those people that are in the, you know, 50-conversions-a-month range — where probably, even if they lower the statistical significance to 80 and use the calculator, they're not going to get there? Do you recommend that they do sequential testing, some other type of testing? What would be the advice for those people?
Andres Glusman (14:41.66)
That's right.
Andres Glusman (14:59.868)
So those people are honing in on truth. They're just doing their best. I'm in that camp, by the way. I don't have Meetup volume anymore — I used to have Meetup traffic volume; my clients have that volume. I don't have that volume on my own personal website. So I'm in there. The solution is two things. One, how do you hone in on the truth? Can you borrow the volume elsewhere? Can you borrow it by looking at other people's tests? Can you borrow it by looking at...
Todd (15:08.747)
Me too. Same.
Andres Glusman (15:27.872)
ads on Google — so your Google ads, your ads on Facebook, your ads on LinkedIn. You can run tests there; you can experiment with messaging. They have a lot more traffic than you do. So the messaging that resonates there — as long as it's relatively close to that stage in the funnel and you're talking to your user confidently — you can learn a lot about what resonates and doesn't resonate, and bring those back to your H1s, bring those back to your website
Andres Glusman (15:52.8)
as you're kind of creating those landing page experiences. So you're borrowing other people's traffic. You're borrowing their insights to hone in on a truth that will work for you. Is it precise? No. But is it better than shooting darts in the dark? Yes, way, way, way better than shooting darts in the dark. And so to me, it's all about getting signal wherever you possibly can, however you possibly can, and know that you're just triangulating in on a truth.
Andres Glusman (16:19.952)
So your media can give you some signal, your experiments can give you some, other people's experiments, your qualitative research — putting rough prototypes in front of people and watching them use them, having discussions with them, selling your product, mimicking your sales page. I'm launching a brand-new product right now, and we put up a sales page for it, but more than anything, I just get on the phone with people and talk to them, and I'm continuously iterating on my message. And it's a relatively inexpensive product, so it's a terrible ROI on my time.
Andres Glusman (16:49.156)
Like, the cost of me selling it versus the amount of money I make from it is terrible. But I do it because I'm really honing in on a pitch, and I've got a really good idea now of what people respond to in this pitch — what the pain points are, how you talk about it, how you solve it. And now I can go back to my website, and I'm actually creating a self-service experience around this new product we've got rolling out, now that I've gone through the pain of being the user interface for some period of time. And you know,
Andres Glusman (17:17.892)
Hopefully we'll get a lot more volume there. We'll be able to start running a lot more experiments. But if we don't, I'm going to have to rinse and repeat or do something similar along the way going forward.
Todd (17:26.13)
Yeah, and one of the things I would say here as well, from my own experience: we don't have the conversion volume either to run A/B tests as the agency, but we do have an experimentation program. And just having a spreadsheet where you actually think about an experiment — you have a hypothesis, you design it, you document it, you say "this is the period we're going to run this for," you analyze the results, and you actually leave those somewhere so they live and breathe — that makes it easier to connect the dots across all these different experiments and to capture the learning. So that's been my own
Todd (17:56.285)
finding.
Andres Glusman (17:57.148)
That's right. And that gives you the discipline of understanding what question you're posing. A lot of the mistakes people make when running experiments is to say, well, our experiment either got lift or it didn't. And so what did you learn? You learned it got lift or it didn't. But if you can design an experiment in a way that says, well, let's test tonality — do we want to go informal or formal? — now you can start to get signal around tonality. Let's test messaging. Do we want photos? Do we want to go with a lifestyle photo or a product photo? Do we want to go with this kind of image or that kind of image?
Andres Glusman (18:25.36)
The more you can systematically vary and gather the insights, gather the learning — have a note around the metadata of each test — the more you're going to get value from the winners and from the losers, and the more the losers are actually going to pay off in the long run. And so that's really where that spreadsheet comes in for you. That's where looking at the past history, the store of other experiments that are very similar, can propel you forward —
Andres Glusman (18:52.592)
to give you better insights than "it worked or it didn't work" — to give you the insight that yes, it is actually this kind of thing, in this kind of location, this kind of message, this kind of approach, that resonates with our users.
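For readers who want to set up the kind of log Todd and Andres are describing, here's a minimal sketch of one way to structure it — the field names and example values are illustrative, not from any particular tool or template:

```python
# Sketch of an experiment log: each test records a hypothesis and the
# dimension being varied (tonality, imagery, messaging...), so that losers
# still teach you something and patterns emerge across tests.
from dataclasses import dataclass
from datetime import date

@dataclass
class Experiment:
    name: str
    page: str                # ideally one of your "golden pages"
    hypothesis: str          # the question you're posing, not just "get lift"
    dimension: str           # what you're systematically varying
    start: date
    end: date
    result: str = "running"  # "win" / "loss" / "flat"
    learning: str = ""       # the insight, win or lose

log = [
    Experiment(
        name="H1 tonality test",
        page="homepage",
        hypothesis="An informal H1 resonates better with SMB visitors",
        dimension="tonality",
        start=date(2023, 6, 1),
        end=date(2023, 6, 30),
        result="loss",
        learning="Formal copy outperformed; audience skews enterprise",
    ),
]

# Filtering by dimension is what lets you connect the dots across tests:
tonality_tests = [e for e in log if e.dimension == "tonality"]
```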
Todd (19:05.062)
Thank you — that was really good advice. And not that this is a plug, but if you're looking for an experimentation sheet to get you started, you can go to our website, and in the footer you can download one for free. It's not gated; there's just a copy in Google Drive. So if you want an experiment sheet, you can go there and grab one. Andres, I'm gonna pause this podcast for a second — can you hear this?
Andres Glusman (19:25.432)
I can't.
Todd (19:25.578)
One second, there's like banging construction going on the whole time you've been talking and I've been trying to zone it out. Just when it starts again, I'll let you know. There you go.
Andres Glusman (19:32.016)
It's.
Todd (19:37.674)
Maybe it's doing a really good job of canceling it. Okay, great. As long as when I'm speaking, you don't hear it, then that's all good. I was just wanting to check.
Andres Glusman (19:37.772)
OK, it's.
Andres Glusman (19:44.408)
I don't hear it at all. I haven't even noticed it.
Todd (19:47.806)
Okay, great. It's driving me a little bit crazy, but I'm going to get through this podcast episode. I cannot believe the second I started recording, some guy with a hammer... Yeah, fuck.
Andres Glusman (19:55.736)
It's like a dog scratching at your door or like for me the lawn guys show up right when I do a podcast for sure.
Todd (20:02.598)
Okay cool, one second, where were we? Okay, we spoke about the conversion one, okay. Okay, what advice would you give to a SaaS marketer that knows they want to do experimentation? Okay, cool. So.
Todd (20:15.798)
I guess, what advice would you give to a SaaS marketer, CMO, whatever — you're running a marketing team, they know that they're not running any A/B tests, they know that they want to get into experimentation, and hopefully this conversation is enough to have inspired them a little bit. But what advice would you give to that CMO that's looking to start with experimentation?
Andres Glusman (20:37.852)
There's a proverb that says the best time to plant a tree is 20 years ago, and the second best time to plant a tree is today. And so if you don't have a whole engine running, you want to figure out how to dip your toe in the water. The number one thing I'd recommend to a CMO, to any organization that wants to get involved with experimentation, is, one, to find a way to do it as easily as you can. But two, to build momentum early.
Andres Glusman (21:07.764)
I started running experiments in the early 2000s. I was exceptionally lucky in that the page that I picked had an opening, had an area that I could work on, and we got a win right away. And it was like a shot in the arm of dopamine. It was amazing. We felt so good. And we just did the math. We're like, okay, we just do this 12 times in the next year, and we're going to double our results. It's going to be amazing.
Andres Glusman (21:34.964)
Of course that didn't happen, but because the first thing we did was a winner, it really helped build momentum and enthusiasm for it. And the more people see you win — momentum begets momentum. People want to do the thing that's working for others. People want to get those results. And so the one page, the one area we were working on — other teams started seeing that, and they said, ooh, I want some of that. That looks really good. I want some of that.
Andres Glusman (22:04.332)
And so then they started doing it, and you get a snowball effect that starts to occur when you can build the early momentum. So the most important thing to do is to find a way to get results early, not let it linger. So many times you hear people say, oh, well, we've got to get these six months of foundational elements in place in order to be able to run an experiment. And you don't — you can get it going a lot faster than you would think. And then two is to try and do your best
Andres Glusman (22:32.56)
to get a win in those first three at-bats. If you don't get a win early on — if you have three losses in a row — you're perfectly normal. It's within the boundary of what is supposed to happen. It just hurts, and you're gonna have a lot of people in the organization say, well, I never wanna do that, or this is terrible, this is a waste of time, why are we wasting all this time and money? So, figure out a way to get the win early. And if you don't get the win early, the other really important thing is to set the expectation for the organization that...
Andres Glusman (23:01.748)
It is normal to win one in five times and that if you can win two in five times, you are a hero. If you can win three in five times, you're off the charts. And that's the important thing to be able to make sure that people have the right set of expectations to generate the enthusiasm or momentum going forward. And if you don't do that upfront and people assume that your first thing is going to win when you've got two losses in a row, three losses in a row, it's curtains for the program.
Todd (23:27.286)
Yeah, that's incredible advice and I mean...
Todd (23:30.282)
If we haven't inspired people enough from what we've already spoken about, I think now is just a really good time to do it, with the economy — particularly in the SaaS space. The tech space has been hit pretty hard. There's definitely not as much money in the ecosystem; there's not always money to fund growth. I think now is a good time. Well, there's never a bad time to experiment, let's be honest. But surely now is a good time to, you know, take stock. Let's experiment with what we already have. Let's make our assets better and try to squeeze as much as we possibly can. I mean, to me, that's just
Andres Glusman (23:45.477)
Yeah.
Andres Glusman (23:58.261)
No, especially in the B2B space it's especially true, where it costs $200, $300 per new customer sometimes, if not more. Small improvements there have a pretty profound impact on that cost of acquisition. And so if you can get more results from the money you're spending, without the need to spend more — that's kind of nirvana. That's the best possible thing that you could tell a CFO.
Todd (23:59.51)
Common sense.
Andres Glusman (24:23.384)
And you know, for a marketer, you're lowering your cost of acquisition, your CAC. If you're trying to get money from VCs, lowering your CAC is an amazing way of getting a proof point — to be able to compare it to your lifetime value to get more funding. Lots of good things happen if you can squeeze more juice out of that traffic that's coming to your site.
Todd (24:43.382)
100%. And I think there's been — I don't want to say there's, like, a movement — but, you know, marketing attribution is really, really tough right now, right? So it's like, we can't measure everything. There's almost this anti-performance-marketing-measurement movement in some respects. And I think, you know, CRO is...
Andres Glusman (24:59.772)
Mm-hmm.
Todd (25:03.126)
just flies in the face of that. I just don't think there's ever really been a better time to experiment and to measure everything — like you said, to maximise and get the most out of your existing assets. And just to build on what you said: I think if you are that CMO and you do get the wins under your belt — exactly what you did at Meetup — you then go away and extrapolate that out, and you say, well, actually now, if we look at the revenue contribution over this time arc, this is how much extra money this has made. And if you can demonstrate that and prove that,
Todd (25:30.134)
then that's where you can get extra budget, you can hire people to, yeah, really ramp it up.
Andres Glusman (25:34.324)
The math there is pretty profound, because there's actually a couple of things in there that are beautiful. One, you don't just get that customer revenue — you get the lifetime value from that customer revenue. So all the revenue that person is going to generate gets pulled forward. Not pulled forward exactly, but it's kind of coming to you, as long as everything else holds. The amazing thing, though, is that if you're working through in a sequence and you're iterating on the same experience over and over again, the wins compound.
Andres Glusman (26:03.012)
So a 10% win followed by a 10% win is not a 20% improvement — it's a 21% improvement. It's 10%, and then 10% of 110, so you end up at 121. So it's a 21% improvement, and each win on top of that compounds. So it's not just that you're stacking these wins up — they're actually gonna become geometric as well. So the math is really, really favorable. That
Andres Glusman (26:32.332)
is the good news. The bad news is that it's not easy to do. It's hard to do, and you've gotta figure out how to squeeze the juice out of it. But it's worth doing if you can, because the outcome is so...
Todd (26:38.086)
beneficial.
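The arithmetic behind "wins compound" is worth seeing on one line — a quick sketch:

```python
# Two sequential 10% wins multiply rather than add: 1.10 * 1.10 = 1.21.
lift = 1.0
for win in (0.10, 0.10):
    lift *= 1 + win
print(f"{lift - 1:.0%} total improvement")  # 21%, not 20%
```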
Todd (26:46.51)
Cool, thank you. Okay, so I think we've set the stage for experimentation. Maybe we can talk about Do What Works. And I guess what I'm interested in — which is a difficult question for you — is to try and find out what does work. Or maybe another way: you help people bypass those kinds of wasted experiment dollars at first base. So what can you share from Do What Works in terms of how this could help these SaaS CMOs?
Andres Glusman (26:59.875)
Mm-hmm.
Andres Glusman (27:16.096)
It's been an amazing journey. I built this thing originally because we were super curious about this — my partner and I built it because we were both super curious. We've now analyzed over 15,000 experiments. We meta-tag — we add a lot of data on every single experiment we see, on mobile web, on desktop web, anything — by specific page, by specific sector, by specific element, like imagery. We don't just look at "imagery," but say, like,
Todd (27:31.512)
Circle.
Andres Glusman (27:43.036)
photos of humans versus photos of products versus photos of hands holding a product versus cartoon images, et cetera. Button color, button this, button that. Any element that people are experimenting with, we add metadata around. And what that means for us is that initially the coolest thing was just being able to look at an individual test and be like, oh wow, Netflix ran that test and I can see their result. That's really neat. Now what we're starting to see, though, is patterns emerge. And so you can see two companies
Todd (28:04.163)
So cool.
Andres Glusman (28:12.188)
test button color, three companies test button color, four, five, six, seven. And one of the things I saw, as I was starting to watch people run these kinds of experiments, is: button color is, like, one of the easiest things you can possibly test. It's like you just got this new toy and you're like, okay, the first thing we're gonna do is make all the buttons red and all the buttons green. And invariably, what I've almost always seen is that when people run those experiments, they almost never win. The experiments related to buttons — like all the buttons being changed — have
Andres Glusman (28:43.212)
marginal impact at best. And so it's one of those things where you're like, you just wasted a month on that thing, right? Varying up the color in a way to figure out how to call something out, I think, is actually quite interesting — if you wanna make something stand out. But if it's just everything one way versus everything another way, for most companies we've seen, especially B2B SaaS, it just doesn't make a difference. It's sort of a waste of a cycle. Those are the kinds of patterns you see over and over again. The other really amazing thing is you see trends.
Todd (28:46.294)
No surprises there. Yeah.
Andres Glusman (29:12.624)
We saw, in the United States around the Super Bowl, Coinbase ran an ad and the ad had a QR code. And suddenly, in the year that followed, almost every marketer embraced QR codes and tried to include them. So we started seeing people running a lot of experiments, throwing QR codes up on their website for the non-logged-in user, to motivate them to download the app from their desktop — to get them to jump over.
Andres Glusman (29:40.572)
And you're like, oh, that's really cool — it's this brand-new emerging trend. And you often see these things emerge where everyone starts copying everyone else, because it's cool and new. It didn't work almost a hundred percent of the time. Where we see people run that experiment on prospective, potential users, it just as quickly fails and they test away from it. Those are the kinds of patterns that we start to see. And so it's really amazing when you start looking at individual tests, but the patterns — those are the things that jump out. Especially things like
Andres Glusman (30:10.084)
button color — obvious things like that almost never make a difference.
Todd (30:16.042)
Yeah, with Do What Works, I don't know how your technology is fuelled. You said 15,000 experiments and you mentioned Netflix there. I don't know if you're able to share that, but where are the experiments cultivated from? Where are you taking that data from?
Andres Glusman (30:34.753)
They're all harnessed from publicly available information that we're able to gather. So we gather the information much in the same way that Google gathers information on the entire web. Google has crawled the entire web; we just look at the world's top 1,600-plus companies, and we're able to look at them in a way that reveals to us when they're running experiments. And our algorithms and our engine — our patented technology — allow us to detect the experiments and to call winners and losers
Todd (30:37.718)
Mm-hmm.
Andres Glusman (31:01.424)
based on the experiments we're seeing, or the patterns we're seeing. And then, looking across multiple companies, looking across multiple experiments running the same thing, we draw pretty strong conclusions on the likelihood that something is going to win or lose.
Todd (31:18.742)
Wow, that's super — and you have a patent on that as well. I'm kind of intrigued. It's a bit of a side question, but how challenging was it to get? You know, I've never got a patent myself, but how difficult is that process? I'm sure you have patent attorneys, but yeah, was it challenging?
Andres Glusman (31:23.545)
Mm-hmm.
Andres Glusman (31:28.431)
Heh.
Andres Glusman (31:32.216)
We, you know, we thought about it. We got the patent mostly as a defensive maneuver. We just said, well, let's just make sure that we put it out there so that we don't get harassed in the future. And so we put an application in, and surprisingly, about a year later, we got a question back from them, we answered the questions, and it sailed through. It was really surprising. I know it's getting harder and harder to get patents, and so I was really surprised that we were able
Andres Glusman (32:01.808)
to get it as easily as we did. Credit to my CTO, who actually managed that process — so credit to him, and to our patent lawyers, probably, for doing a great job of putting it together. But it was much, much, much easier than I thought it was gonna be.
Todd (32:14.99)
Thanks. Okay, back to experimentation and trying to inspire people to do more of it — specifically talking about conversion rate optimization as it pertains to marketing. The vast majority — and maybe you disagree — but the vast majority of CRO work, in my opinion, is research, right? Actually, you know, doing the customer research, trying to understand what the actual problem is to come up with a hypothesis — doing that research is kind of where the gold is,
Todd (32:42.682)
not always, but then to actually build and execute the experiment is actually less of the work. So if you kind of agree that research is a really big, chunky piece of the work within a marketing context, what do you believe are the one, two, or three — the 80/20 — of research? Where would you spend your time on research if you were in marketing?
Andres Glusman (33:04.028)
It's a great question. The reality is exactly as you say: because it's easy to get it out the door, it takes forever to get the results, and your win rate is low, it makes a lot of sense — it's rational — to invest more time upfront to try and improve that win rate, or to improve the impact you're gonna get from it. And the research that you do drives your ability to innovate in general. There's a few key sources you wanna look at
Andres Glusman (33:34.012)
for insights. One, of course: start with your customers, and have clarity on what kind of company you're targeting, how and who they are. You'd be surprised how few companies talk to their customers, and how much you can learn from having relatively open-ended conversations with customers reacting to your own product, and from watching how people use your stuff. So just having them come in, having them get on a Zoom call —
Andres Glusman (33:58.744)
watch them. Giving them control of your product, and watching them use it and talk through it, will reveal so much that you would never learn through any A/B tests, through any outside experts coming in. You're gonna learn a lot very, very, very quickly with just a handful of sessions — six, seven, ten sessions where you're watching people use your website to do X, Y, Z is great. You can learn just as much
Todd (34:22.22)
Yeah.
Andres Glusman (34:25.136)
from watching them use your competitors' websites, or the website of a company that's doing something similar to yours. So if you're thinking about doing a new pricing page and you say, hey, I really like the pricing page that Calendly has up, you could actually usability-test Calendly, even though you have nothing to do with Calendly. You can get people responding to the way in which it's laid out, or to the way in which all the information is conveyed. Just as easily — or, what's interesting as well, what I love —
Andres Glusman (34:54.716)
is to get research not just from your direct competitors — which is always helpful, to learn from your competitors, because they give you signal about your customers and the needs you're solving. But the innovation you can get is by looking at companies in adjacent spaces. Like we were talking about earlier: maybe you don't compete with Calendly, but you are inspired by their pricing page. And what we really encourage people on our platform to do — our clients —
Andres Glusman (35:23.544)
is to not just look at and track their direct competitors in the streaming space, for example, but to then also look and say, well, what are people doing in the learning space? What can I learn from MasterClass? What can I learn from these companies who are selling digital subscriptions related to X, Y, and Z, in terms of how they convey their value prop, or how they're conveying a discount? What is it that we can learn from all the Black Friday sales that happened last year, from other people that are not competing with us?
Andres Glusman (35:51.16)
Let's draw those lessons in and pull those forward. So where innovation comes from, more often than not, is from learning those lessons — by studying and understanding what experiments others have run. You can ask them for it, by the way. There's no reason why you can't reach out to people and ask them for their results; you can offer something in exchange. That's what we're looking to facilitate — we're obviously trying to facilitate it at scale, but there's no reason why people can't try to approximate that at a smaller scale. The number one thing I'll just say —
Todd (36:19.468)
Yeah.
Andres Glusman (36:21.156)
which is the hardest thing to do — is there's so much conventional wisdom. There's so many things that people learn by looking at who's hot right now. And more often than not, it's the wrong lesson learned. You'll see people see, you know, Notion becomes really, really, really popular — it's great, it's taking off. They have a lot of caricatures on their site, little illustrations that are super cute. Everyone's like, oh, illustrations are what we need to copy. Is it?
Andres Glusman (36:49.156)
Is that really the thing that made Notion successful? Or is it the fact that they have a really viral product? Is it the fact that they've got a great referral program, that they have really generous ways of using the premium services? Is it that it solves a problem really well, and that you need other people to be a part of it in order to use it? It's about making sure that, when you're looking at your comparables or where you're drawing lessons from, you're looking at the right set of people to learn the right lesson.
Todd (37:13.598)
Yeah, great advice. I also think there's just no substitute for having really solid positioning — to really, really understand who your customer is, what specific value you offer to that customer, and then to be able to very clearly articulate that to your customer. So, just to have really good copy. I mean, of course, design, obviously, is super, super important. I always think the amount of time you invest in design is directly reflected in how much time
Todd (37:42.474)
other people invest in it. So you have to invest time into good design. But I think most important: if you have shitty positioning, a not very well-defined market, and you poorly communicate it with your copy, you can have nice icons and graphics and beautiful design, and it's just going to fall flat. That's always been my experience.
Andres Glusman (37:45.563)
Mm.
Andres Glusman (38:04.396)
Yeah, design is always the number one draft pick. If you can get a great product designer on your team early on, and have the right positioning around it — boy, the results follow very, very, very quickly. Ha ha ha.
Todd (38:18.402)
Just a follow-up question on the usability testing. So I really agree. I mean, if you've never done that before, there are different platforms where you can do this as well. For example — just naming a few — Usability Hub is one. I think Wynter, Peep from CXL's spin-off company — they do message testing. But to have people that actually go to your website, use it, and give you real feedback, I think, is great. I know back in the early days at Meetup, you kind of...
Todd (38:47.826)
I don't know if that's the right word, but you kind of made your own usability program, which was having people come into the office regularly to do this on a consistent basis. But in 2023 — because I think that's such good advice — how would you recommend somebody actually go and do that? Are there specific tools you use? Do you think they should go and source those people themselves and get them to come into the office? How should you do that kind of thing today?
Andres Glusman (39:11.744)
It is so valuable and important to do. And the answer is: by any means necessary. When I was doing this, none of those tools existed, so we had to make the stuff up. There was no UserTesting.com, there was no Wynter, there was no Usability Hub — all those things just didn't exist. And so we basically had to invent our own ways of lowering the barriers to getting in front of people. We would invite them in — just recruit them off of Craigslist — and have hundreds of people come through the office every single year,
Todd (39:26.858)
Yeah. Yeah.
Andres Glusman (39:40.556)
and watch them use our product in really rapid iteration cycles. One of the things we did, that an old friend of mine just reminded me of and that I'd completely forgotten — you can do as little as... what is it? We used Shake Shack. It was a very, very popular thing that had opened, and there was one location in New York City, and on a beautiful summer day it had a line that would stretch almost a block. It would take 45 minutes
Andres Glusman (40:09.672)
to get your food ordered. And what we would do when we were testing the app is we would go and approach people in line — they were basically our target demo — we would approach people in line with our app, and we'd say, if you agree to let us watch you use this thing for the next five minutes, we'll buy you your burger. And so for six dollars, you know, for the price of a burger, we would get this feedback. And you just kind of work your way down the line — they're bored, they're just sitting around waiting.
Todd (40:26.594)
Wow that's super smart. I love that.
Andres Glusman (40:37.808)
You know, so you can go do that. I don't recommend doing it to people in coffee shops who are working — interrupting them from whatever they're doing and saying, hey, will you go look at my product? That may or may not work; I've had a lot less success with that, to tell the truth. But finding a captive audience — it's just a question of finding people who have a moment and a desire to give feedback, and giving them just enough of a nudge, an incentive, to do so. In this day and age, in 2023, with Zoom,
Andres Glusman (41:04.888)
with these tools available, with screen-sharing technology — man, it's so much easier just to put stuff in front of people and have them react. And that's probably the number one way I'd recommend using usability sessions: using your prototypes, your product, your competitor's product as a prompt that stokes a question, that facilitates a conversation. That's ultimately what you want to use it for. And with seven to ten of just
Andres Glusman (41:34.324)
enough of the right kind of people — similar enough to each other that they approximate your user base — you can learn where the boulders in the road are, and that's your whole goal. You don't need 99.999% significance. You can have a lot of error. You just need to figure out the fundamental pieces that are broken, or that are suboptimal, that you can improve — and not react too hard to any one session. Gather three, four, five, ten.
Andres Glusman (42:00.624)
But really, at about ten, you've heard the same thing over and over and over again — people are stuck on X, Y, and Z. You don't need to do another 40 tests to know that. It'll get very, very clear if you start hearing the same thing over and over again, and then you can rapidly iterate. So just put yourself in a position where you can do that. And for me personally, when we're launching something, I'm trying to get in front of people as early as I possibly can and have them use it or give me feedback. And if I've been working on something,
Todd (42:07.115)
Yeah.
Andres Glusman (42:29.356)
and it's been weeks since I've gotten feedback on it, I'm like, ugh, I feel bad. I feel uneasy — very, very uneasy. Because I love just getting that little touch base to know that you're going in the right direction.
Todd (42:42.314)
Yeah. Another small piece of advice for people that maybe haven't ever done usability testing before — one really low-hanging piece of fruit. Everyone talks about Hotjar: just looking at recordings and seeing how people interact with the page can be somewhat useful sometimes. But I think another really easy way to get started, if you don't want to recruit people, is to ask people that do convert — people that either book a demo or maybe do a free trial — ask them: what almost made you not convert today?
Andres Glusman (43:11.134)
Man.
Todd (43:11.394)
or you can think about questions you can ask those people as well.
Andres Glusman (43:14.936)
And use your sales process as a way of gathering that insight. Absolutely. Yep.
Todd (43:19.05)
Yeah, for sure. Cool, just going back to experimentation. So do you have a — I mean, there's no shortage of opinions.
Todd (43:29.982)
Everybody has an opinion, everybody has ideas, and I think when you start doing this research — usability testing — you're probably not gonna be short of ideas to test. How do you go about prioritizing? There are different frameworks for this — I think PIE and ICE are two of the most well-known ones, and there are other proprietary ranking systems out there — but how would you advise someone to prioritize their experiments?
Andres Glusman (43:55.664)
I'm a fan of the RICE version. So: reach, impact, confidence, effort. Reach is quite obvious, right? Reach is how many people this can go to. So that's where we're talking about golden pages — how many people will this affect? How many of our customers will this affect? Obviously your signup page will have fewer people reaching it, but it has a pretty profound impact on your overall conversion, right? All your customers went through that, so it's a pretty good thing to be working on.
Todd (43:58.066)
Okay, nice.
Andres Glusman (44:24.28)
So number one is reach: how many people am I gonna touch? How many people could this affect? That's a very, very big question — or not just people, but potential customers. That's number one. Then the others — so it's reach, impact, confidence, effort.
Andres Glusman (44:42.488)
Effort is relatively easy. So the easiest thing to know is reach; effort is relatively easy in that you're going to estimate how many weeks it's going to take you to launch something — how many days, how many hours — and then you're going to double or triple it. And that's going to be a more or less accurate notion, right, where you were just too optimistic. The middle two are unfortunate in that they're useful, but they're fictions — they're works of fiction that you're using to put math around. And so you just have to know that you're dealing with fiction. Impact:
Andres Glusman (45:12.396)
you have no idea what the impact is going to be before you launch it. Because if you knew the impact, you wouldn't run an experiment. And so, you know, you can have hope. You can run scenarios: if I get an impact of 5% on this reach, does it make a difference? If I get an impact of 10, 20, 30, will I be happy? That's the question you can ask with impact — but you're only going to know at the end of the experiment. Confidence is also very similar,
Todd (45:14.498)
So subjective.
Andres Glusman (45:37.1)
in that you actually don't have a metric for it. We've been working to create a confidence metric that we call a bet score. It relates to the aggregation of all the experiments that are similar to yours in your space. And so we have a bet score that we produce, on a scale of one to 99, where one is, like, there's a snowball's chance in hell this thing's going to work, and 99 is that this is so consistently a winning pattern that you can bet very aggressively on it.
Todd (45:49.592)
Bye bye.
Andres Glusman (46:06.484)
And obviously, anywhere around 50 it's a bit of a toss-up. So what we do when we give recommendations to clients through our professional services, we basically give the recommendations: here's what we think is suboptimal on your pages, here's what you should do. And every one of those things has a bet score, to give a level of confidence and to be able to stack rank. But what that ultimately does, the job of it, is just to know: should I do this today or do that today? Should I work on
Todd (46:11.724)
Yeah.
Andres Glusman (46:35.28)
product A or product B or product C or product D? Just stack rank and devote your effort as best as you possibly can to the things you have a hunch are gonna make a difference, using data to approximate the truth and just do better than shooting blindly. But of the four metrics there, reach, impact, confidence, and effort, I feel the middle two are the haziest.
Andres Glusman (47:02.636)
And the one I think you can most legitimately approximate with limited data is going to be confidence. So that's where we focus.
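For readers who want to make this concrete, here is a minimal sketch of a RICE-style stack rank in Python. It assumes the commonly published multiplicative formula, score = (reach × impact × confidence) ÷ effort, and every idea, name, and number in it is an invented placeholder; the confidence values are just guesses, not Do What Works' bet scores.

```python
# Minimal RICE prioritization sketch. The formula and all numbers are
# illustrative assumptions, not any company's real data or tooling.
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    reach: int         # potential customers affected per month
    impact: float      # hoped-for relative lift (a fiction until you test)
    confidence: float  # 0-1; e.g. a 1-99 bet-score-style rating / 100
    effort: float      # person-weeks, after doubling your optimistic guess

    @property
    def score(self) -> float:
        return self.reach * self.impact * self.confidence / self.effort

ideas = [
    Idea("Rewrite signup-page headline", 8_000, 0.10, 0.80, 1),
    Idea("Redesign pricing page", 20_000, 0.10, 0.50, 6),
    Idea("Add social proof to homepage", 50_000, 0.02, 0.45, 2),
]

# Stack rank: highest score first, then spend effort from the top down.
for idea in sorted(ideas, key=lambda i: i.score, reverse=True):
    print(f"{idea.score:8.1f}  {idea.name}")
```

As Andres says, the middle two inputs are fictions; the score's only job is to decide what to work on first, not to predict the outcome.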
Todd (47:11.178)
Yeah, nice. I think you hit the nail on the head. If you start getting into experimentation, don't shoot blindly and don't just put your finger in the air and randomly pick an idea. Have some logical ranking system. There are different ones out there; you can research this stuff. It's not hugely scientific, it's just combining those different scores. So cool, moving on. What about...
Todd (47:34.482)
You must have obviously seen so many different experiments, I mean, 15,000, and you've also been doing experimentation for a very long time. I'm kind of intrigued whether there are any really interesting experiment stories, because of an incredible result or for whatever reason. Do you have an interesting experiment story?
Andres Glusman (47:51.088)
Yes, my favorite experiments are the ones that prove to me I'm wrong, or teach me something about humanity, and that force me to look at the world in a different way or question conventional wisdom. There was a great moment at Meetup where we had this challenge. At Meetup there are two kinds of users: there are members and there are organizers. Organizers pay Meetup money and they organize the events. So they're very, very, very important.
Andres Glusman (48:19.664)
There was a problem that Meetup had, which is that because Meetup did such a good job of promoting groups to people and getting people to join groups, it attracted people who were a little sketchy. There were people who wanted to create meetups that were sort of fake meetups, that were really just marketing a product. Come to my meetup and learn about essential oils. Well, no, that's not really what the people subscribing want to hear about, right? So there were these people who would sort of abuse the product.
Andres Glusman (48:48.62)
And our CEO, Scott, said: what I really want to do here is add an affirmation, a checkmark that you have to click in order to create a group, where you have to promise that you're creating real-life community and that you're not a sketchy business person. So he wanted to add a step in the signup and purchase flow that raised the commitment level and added friction.
Todd (49:17.439)
Increased friction, yeah.
Andres Glusman (49:18.636)
Exactly. And we said: Scott, that's a terrible idea. It's going to kill our results. We're going to lose a lot of revenue. We're going to have a hard time paying for all these people we're hiring, et cetera, et cetera. And he's like, well, I really think we should. I said, well, let's run it as an experiment at least, so I can prove to you how much money we're going to lose. So we launched this thing, we included this checkbox in there, and we watched it for a few weeks, about a month. And lo and behold, results not only did not go down,
Andres Glusman (49:48.46)
they actually went up. We had a 16% lift. And so it's one of those case studies that I love, because there are so many gurus out there that will basically say: this is the conventional wisdom, this is what you need to do. You need to have this kind of social proof, you need to reduce the friction, you need to do this, that, and the other. And it's all based on just conventional wisdom. It's not based on data. It's kind of shocking. And so every time I see an experiment like that where
Andres Glusman (50:17.452)
it doesn't give me the answer, because I actually don't 100% know why adding that friction improved the results. But what it did do is cause us to think a bit more about the questions around why, and to start probing and exploring from there. So we said, well, I wonder, is it because we're reminding people of the value of the "real" in real-life experience? Is it because we're basically saying:
Andres Glusman (50:39.768)
you're going to be amongst a group of people who have the same shared value system as you, who really believe in this thing. You're going to be in a more exclusive club, because other places are where sketchy marketers go to abuse the system, but this is not one of those places. So you're going to be in a standout place. Or maybe there's another reason; maybe it just reminds them of something else. All these things were questions that we could then pose, that we could put into our usability lab so we could ask users about them, and that we could explore with follow-up experiments to learn more about what actually made a difference.
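As an aside, for anyone who wants to sanity-check a lift like that 16% before celebrating: a two-proportion z-test is one common way to do it. The sketch below uses only Python's standard library, and every count in it is invented for illustration; none of it is Meetup's actual data.

```python
# Back-of-the-envelope significance check for an A/B conversion lift.
# All counts below are hypothetical, chosen to show a ~16% relative lift.
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z, two-sided p-value) comparing conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided normal tail
    return z, p_value

# Control: 500 of 10,000 convert (5.0%). Variant: 580 of 10,000 (5.8%).
z, p = two_proportion_ztest(500, 10_000, 580, 10_000)
print(f"relative lift: {0.058 / 0.05 - 1:+.0%}, z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the lift is statistically significant (p ≈ 0.01), which is the kind of check you'd want to run before acting on a surprising result like the checkbox one.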
Todd (51:11.294)
Yeah, it's super cool. I mean, when it comes to running experiments, you really do realize how often you get it wrong, how your assumptions about something are completely wrong, and there's something beautiful in that, isn't there? You thought one thing, and now you have the data to show otherwise. So that's kind of the beautiful thing about experimentation. One thing I would say, though, just from my own observation and experience, is that...
Andres Glusman (51:24.379)
I love it.
Todd (51:41.25)
Best practice and conventional wisdom can work to a point, and let me explain what I mean by this. I will often work with companies that are pretty terrible at writing copy and structuring landing pages. The benchmark is just super, super low. So if you have a really terrible landing page, you can, in my opinion, implement best practice and conventional wisdom, really go in there, and get that benchmark up without having to experiment. But I think once you've
Todd (52:08.594)
really lifted the bar, it's then difficult to find those big wins. That's just been my experience, anyway.
Andres Glusman (52:15.928)
Yeah. There's really bad user experience, there's kind-of-bad practice, and then there's familiarity, which is great. When certain things start to conform to a familiar pattern, it becomes easy to learn; you're not teaching people how to use the product. We've had folks we've worked with that had really bold, cool websites where, when you scroll down, the page actually didn't go down, it went diagonal. It was really just a feat of engineering, really neat.
Andres Glusman (52:45.284)
But it was so confusing. You're like, that's neat, but it's such a pattern interruption, it's so different, that it's not at all good. On the flip side, you have things that are so much conventional wisdom that everyone has come to expect them and becomes blind to them, in the same way you don't even notice advertising on certain websites. And any company who wants to put
Todd (52:47.39)
I hate those websites. Yeah.
Andres Glusman (53:14.572)
something useful in a spot where you would expect advertising to go is basically not going to have that thing seen, right? It's a pattern, and users have learned that pattern. So you definitely do need to understand what the general pattern is, and conforming to it is great to a degree. But lean too hard into it and it actually becomes meaningless; you don't need it, is my point of view. So there are a lot of things that are conventional wisdom that people will tell you you have to do.
Todd (53:22.952)
Yeah.
Andres Glusman (53:44.108)
I don't think you have to do them at all. I think they're actually a waste of time.
Todd (53:47.958)
Yeah, got it, thank you. Well, just to switch gears, maybe on a more personal note: what gets you out of bed when it's not work? What do you find interesting? I saw surfing on another bio somewhere. I don't know if you're still surfing, but what do you do in your spare time that interests you?
Andres Glusman (54:05.548)
Yeah, I live near the New Jersey shore, which is wonderful, especially this time of the year. It's springtime here. I do have a longboard. Unfortunately, since I've launched this venture, I have not gotten on my board once. I got my daughter out in the water and pushed her on the board a few times, but I actually have not been surfing in like three or four years. I'm going to take care of that this summer. I have four dogs and I love hanging out with them. I have a wife and two kids, and I love hiking.
Andres Glusman (54:34.953)
So those are my current adventures at the moment. And yeah.
Todd (54:37.566)
Yeah. Is the company, sorry to interrupt, is Do What Works a fully remote company or? Yeah.
Andres Glusman (54:44.204)
Yeah, it's a fully remote company and it's been remote since day one.
Andres Glusman (55:12.336)
We have somebody in Georgia, somebody in the UK, a lot of people in New York, all over the world right now. It really allows us to hire the absolute best talent and give them a ton of flexibility in where they want to work. It's a pain in the neck as an employer, I'll tell you that much. I didn't expect this, but it's a real pain: every time I hire somebody in a new state, I have to go through a whole registration process. It's a huge pain in the neck.
Todd (55:34.186)
Yeah, that's one of the really challenging things. We're a fully remote company as well; we hire people in multiple different countries. We're a smaller company, so maybe it's a bit easier to navigate. But you can look to these third-party companies where you can set up, I don't know what they call it, like an entity in a different country, and then you hire through them. It's super complicated, right? Because if you hire somebody in Dubai, for example, you've got different employment law, different public holidays, different maternity and paternity leave, whatever the laws are. So, yeah, navigating that
Andres Glusman (55:53.057)
It's such a pain.
Todd (56:04.16)
I think is a little bit challenging. But nonetheless, you still get to hire, like you said, the best people around the globe. And I know for us... do you have people in different time zones? Is that something you have to deal with, or are you all kind of concentrated into one?
Andres Glusman (56:17.636)
No, we deal with a wide swath of time zones. Wide. It's such a pain. There's one person who phones in at midnight, or like 11 PM. Another person on the opposite coast of the United States phones in early in the day. For a team meeting, there's one slot during the week where we can really nail it. Otherwise, it's just impossible.
Todd (56:21.94)
Oh wow.
Todd (56:36.03)
Oh wow. Oh man, I can imagine. Cool, well, listen, Andres, thank you so much for your wisdom and advice. I've personally found this conversation super interesting, because I love experimentation too, and hopefully we've inspired some people to start experimenting, or to improve the experimentation programmes they're already running. Where can people find you, if they so want?
Andres Glusman (56:56.58)
Yeah, the two best places to find me are, one, my website, dowhatworks.io. And then I tend to hang out on LinkedIn. I like LinkedIn. So: Andres Glusman on LinkedIn. You'll find me there. That's where I hang out.
Todd (57:10.142)
Yeah, cool, I'll mention you in the show notes. Andres, thank you so much for your time, much appreciated, and speak soon. Thanks, man.
Andres Glusman (57:14.224)
Right. My pleasure, Todd.