>>Ashkahn: Okay.
>>Graham: All right.
>>Ashkahn: Welcome, everybody.
>>Graham: Hello.
>>Ashkahn: This is Ashkahn.
>>Graham: I am Graham.
>>Ashkahn: And we are ready to slam and jam.
>>Graham: With the question, which is, "do you ever run A/B tests at Float On to test
the effectiveness of marketing strategies?
Or have you ever heard of other centers doing this?
If so, any suggestions are appreciated."
>>Ashkahn: That's what they wrote, appreciated?
>>Graham: That's just how you say that.
Right?
>>Ashkahn: That's the-
>>Graham: Yeah, I appreciate you pointing that out.
>>Ashkahn: No, that's not correct.
>>Graham: Yeah, so, A/B tests, otherwise known as split testing.
>>Ashkahn: Otherwise known as onesie-twosies.
>>Graham: That's not true.
>>Ashkahn: I think so, I'm pretty sure ...
>>Graham: That's absolutely-
>>Ashkahn: No, they threw that on a website-
>>Graham: Actually just go Google, yeah sure-
>>Ashkahn: ... with A/B tests and they found that people-
>>Graham: Go Google onesie-twosies and see what comes up.
>>Ashkahn: ... preferred onesie-twosies.
>>Graham: Let's see, A/B tests, split tests, basically just testing two different variations of something-
>>Ashkahn: So this is-
>>Graham: ... on your website.
>>Ashkahn: This is kind of weird to realize is happening to you if you've never heard about it before, but it's incredibly common when you go onto the website of some sort of company on the internet. They often will show you one of a certain number of versions of the website. So you may actually be seeing something different than somebody on another computer going to that website.
Oftentimes, this stuff can be pretty subtle.
You know, it's like-
>>Graham: Colors or buttons, or something's bold versus not bold.
>>Ashkahn: Or it's big, like, "Hey, we're showing you this as our main homepage call to action," versus a whole, completely different call to action for somebody else.
The basic idea is that if you show half the people one thing and the other half the other thing, you can track how many people clicked through to the thing you wanted, or took whatever action you wanted them to take, usually something that leads to people buying things. You can just see, like, "Hey, when we made this button green, 10% more people clicked on it than when it was orange."
>>Graham: Science!
>>Ashkahn: So, yeah, like, "Okay, I guess we should make it green."
>>Graham: Yep.
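As a rough sketch of the mechanic being described, here's what randomly splitting visitors and tallying conversions could look like in Python. The variant names, tracking structure, and behavior here are invented for illustration, not how any particular tool does it:

```python
import random

# Tally of how many visitors saw each variant and how many converted.
counts = {"green": {"shown": 0, "clicked": 0},
          "orange": {"shown": 0, "clicked": 0}}

def assign_variant():
    """Randomly assign an incoming visitor to a variant (50/50 split)."""
    return random.choice(["green", "orange"])

def record_visit(variant, clicked):
    """Record one visit and whether that visitor clicked the button."""
    counts[variant]["shown"] += 1
    if clicked:
        counts[variant]["clicked"] += 1

def conversion_rate(variant):
    """Fraction of visitors shown this variant who clicked."""
    c = counts[variant]
    return c["clicked"] / c["shown"] if c["shown"] else 0.0
```

At the end of the test you'd compare `conversion_rate("green")` against `conversion_rate("orange")`, which is exactly the "10% more people clicked" comparison above.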
I guess it's worth mentioning that, with A/B testing, you're dynamically choosing, as people
visit the site, who's being presented with one version versus another.
So you're kind of randomly assigning them in real time, as opposed to having your site
one way, let's say you have a big blue button or something, and then you've had it that
way for months, and one day you're like, "Oh, I wonder how a green button performs?"
So you switch the button to green on your site, and then you kind of compare this new
data versus the historical data with the blue button.
That's one way to run tests, but A/B tests are nicer because they control for a bunch of variables. Maybe people clicked on the green button more because you happened to run that test around December, and December was a strong month, as opposed to the color actually being what made the change.
So, if you're splitting people live you're just controlling for every other variable
except the change you're making.
So that's kind of the preferable way to do it.
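One common way to implement that live split, sketched here under the assumption that each visitor carries a stable ID such as a cookie value (this illustrates a general technique, not how any specific product works), is to hash the visitor ID so the same person always lands in the same variant:

```python
import hashlib

def bucket(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically map a visitor to a variant.

    Hashing a stable visitor ID means the same person sees the same
    variant on every visit, while the population as a whole still
    splits roughly evenly between variants.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(bucket("visitor-123", "button-color"))  # same result on every call
```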
There's a ton of advanced tools out there now for A/B testing, split testing-
>>Ashkahn: Onesie-twosies.
>>Graham: ... so you can get pretty nuanced and complicated with it.
>>Ashkahn: Yeah, so this is something that you would get as-
>>Graham: That's not what it's called.
>>Ashkahn: ... something you add to your website that can help you do this.
Along the same lines, since the whole benefit of this is that you're controlling as many outside variables as possible, you also want to A/B test very specific things.
>>Graham: Whether or not A/B testing is worth it?
Try running your site without A/B testing-
>>Ashkahn: And see how it compares?
>>Graham: Yeah, then run your site with A/B testing and see which did better.
>>Ashkahn: But you don't want to change a bunch of stuff.
Ideally, you're making one tweak and tracking that tweak, and once that's done you're making
another tweak and tracking that tweak.
Otherwise, it's hard to tell what specific thing you did is actually having the impact.
>>Graham: Yeah.
Sometimes you can combine changes per page. Sometimes you can make different changes on separate pages. But yeah, the goal is adjusting just one thing at a time.
If you're changing the color of a button and this main call to action, and your nav bar
is different, and then someone takes a totally different action on your page, which was it?
Which one of those changes actually had an impact?
Keeping things one test at a time is the way to go.
So yes, we have run A/B tests.
You can also run A/B tests, I guess it's worth mentioning, not just on your website.
You can A/B test headlines for emails that you send out.
You can A/B test actual content for emails that you send out.
Really, you're just running an experiment.
>>Ashkahn: And you can expand the concept to your float center. I mean, it's hard to run things simultaneously in your float center, but you can try different membership signage and track how well each one is doing. Although it just gets looser in the outside world.
>>Graham: Sure.
We've actually done it with beginner's guides too. We were putting beginner's guides at different locations around town, each with a different discount code. The discount codes let us track who redeemed what, so we could actually tell which locations our beginner's guides were performing better at, based on how many codes of each type had been redeemed.
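A minimal sketch of that discount-code tally, with made-up codes and locations standing in for the real ones:

```python
from collections import Counter

# Hypothetical redemption log: one entry per redeemed code (invented data).
redemptions = ["CAFE10", "GYM10", "CAFE10", "YOGA10", "CAFE10", "GYM10"]

# Which pickup location each code was handed out at (invented names).
code_to_location = {"CAFE10": "coffee shop", "GYM10": "gym", "YOGA10": "yoga studio"}

tally = Counter(code_to_location[code] for code in redemptions)
for location, count in tally.most_common():
    print(f"{location}: {count} redemptions")
```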
So yeah, you can apply this to many other things, but most commonly, and like the question
kind of implies, it's done on websites.
>>Ashkahn: Yeah.
The easiest implementation of it is certainly on the web, and that's where you get the most control over it.
>>Graham: So, totally, we've played around with it.
There's some easy tools out there to make it so it's not so challenging to run these.
A great one that I know of right off the top of my head is Optimizely. It's really easy to use. It's super simple to set up, and it lets you get started with a WYSIWYG editor, what you see is what you get, to manipulate the web page and immediately launch variations. So it's easy to implement, easy to actually make the changes even if you're not a big techy web person, and really easy to view the results.
It does cost a little bit of money.
I think they have a free plan, depending on how many visitors you're getting per month.
But that's a really easy way to start, and then there are other similar tools out there
too.
I think Visual Website Optimizer is one, and there are a few other A/B testing tools.
If you just do a quick Google search I promise you they're going to pop up towards the top
because they're A/B testing all kinds of ads for themselves.
>>Ashkahn: In terms of what we're A/B testing, when you're thinking about your website, really,
your main goal is probably going to be to get people to book appointments.
You've got to remember what your goal is to figure out what you want to test to try to
improve that.
So, if you've never done this before, it's good to start with the biggest things.
Like, take your call to action and play with it, and see if you can improve the rate at which people go from your website to your booking page and try to book appointments.
I mean, start with the lowest hanging fruit or the thing that's going to give you the
biggest reward if you manage to find a good tweak to it.
>>Graham: Yeah.
Refining your About Us page so that people spend more time reading about you is certainly something you can do with A/B testing, but like Ashkahn was saying, that's not going to lead to the most sales right out of the gate. Calls to action are a really good one: testing the size of that, testing how many other things are on the page.
>>Ashkahn: The words.
"Book Now" versus "Schedule an Appointment" versus "Schedule a Float".
That's definitely something to play with.
>>Graham: Yeah.
Again, like we mentioned, it's no joke: the color of a button can have a huge impact on how many people actually click it.
Green versus blue versus red.
>>Ashkahn: Look into some of this stuff. This whole world of neuromarketing is really strange, because when you talk about it, it all sounds so dumb.
Where it's just like, "Yeah, and you know"-
>>Graham: I don't think it sounds dumb.
I think it sounds like science.
>>Ashkahn: Well, I don't mean like how it's done.
I just mean the fact that like ... When you look at the stuff-
>>Graham: Oh, that the blue button versus a green button-
>>Ashkahn: ... they're like, "Hey, if you change this button to say 'Book Now!' with an exclamation point, that's gonna make 10% more people click it."
And you're like, "no way, how could that possibly be true?"
But there's just all this actual research out there that they've done and it's true.
These subtle cues affect us whether we want them to or not: the color of things, the wording, words like "now" or "free," stuff like that all have an impact on us.
It's a little frightening but it's worth knowing.
It sounds kind of futile at first to be thinking that you're spending a bunch of time changing
"book here" to "book now" but that stuff can make a difference.
>>Graham: Yeah, and a good place to find these tests to run, whether again it's simplifying your page or adding a video or changing the text of something: if you just look up "common A/B tests" or "great A/B tests to start with," anything like that will pull up some nice best practices, tests that other people have run across their websites that have tended to perform really well.
Fortunately, because it's tested and, again, it's very scientific, the results are really easy for people to publish, and really easy for other people to grab and collect. There really are lists of 20-25 A/B tests, and if you've never changed anything on your website, going through a list like that will probably give you some nicer conversions from your website over to people booking floats.
>>Ashkahn: There's just some heuristics out there when you look at the data, like this
color performs better than that color.
These words performed better.
You can use those as guideposts and try stuff out, but at the end of the day those are just generalized tips. You may find that in the context of your website they don't act as strongly, or something else is actually different, but they're really good places to start and good things to test for, because most of the time they are improvements.
>>Graham: It'll at least give you an idea of what people are doing out there.
After going through examples of how people have run other A/B tests, if you read a hundred different A/B tests that other companies have written up, you really start to get the point.
You understand what people are aiming for, you understand how they're doing it.
You actually see some of the really cool, inflated numbers you can get as a result of
really small tweaks to your site.
It gets, at least for me, kind of addicting.
Once you start running scientific tests on stuff, you actually do see, "Oh, 5% more people did this," or "30% more people did this," or didn't drop off from my cart, for example, like, "Oh, 30% more people actually finished their checkout when I changed this word." That's really cool data, and you're like, "Oh, this is actually leading to more people hopping in a float tank," which is kind of cool.
>>Ashkahn: If you have never heard of this stuff before or didn't build a website with
any of this in mind, you're probably gonna find some really obvious ones at the beginning,
like even having a call to action prominent right on your homepage. Things like that may be nice little bonuses for you.
>>Graham: Yeah, so that's it.
You may have heard the words before, A/B test, split test, or you might not have, but just know that, fortunately, the technology and tools available to business owners now are robust enough that you can get started pretty easily and without much technical knowledge.
In my mind there's no reason for people to not start A/B testing and at least running
a test a week or a test a month or something.
These things really add up.
If you're just making little improvements, 2% here, 3% there, not even counting the really big improvements you can find, after a year those 1%, 2%, and 3% gains totally add up and collectively mean a big improvement on your bottom line.
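To make that arithmetic concrete, here's a tiny sketch of how small monthly lifts compound over a year; the lift percentages are invented for illustration:

```python
# One assumed conversion lift per month from a successful test (invented).
monthly_lifts = [0.02, 0.03, 0.01, 0.02, 0.00, 0.03,
                 0.01, 0.02, 0.02, 0.01, 0.03, 0.02]

factor = 1.0
for lift in monthly_lifts:
    factor *= 1 + lift  # each improvement multiplies the previous ones

print(f"Combined improvement after a year: {(factor - 1) * 100:.1f}%")
# Roughly +24% here, even though no single test moved things more than 3%.
```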
>>Ashkahn: I feel like I have a word of warning to balance that out a little bit, which, in my mind, is that there are a few things to consider with A/B testing.
As you start researching it and applying it, there's some stuff that I think is just a
little different for the type of businesses that we're talking about.
One is that when you look into this A/B testing research, or when you talk about these companies doing A/B testing, most often you're looking at humongous web companies. So there are a couple differences that I like to keep in mind when I think about our shop versus Facebook or something.
I heard somewhere that Facebook has more variations than there are users for Facebook.
>>Graham: Than there are people on the planet.
>>Ashkahn: Than there are people on the planet.
It's just insane how much stuff like this they do. And I'm sure Amazon does an unbelievable amount of A/B testing.
>>Graham: Oh for sure.
>>Ashkahn: That's because, first of all, they're absolutely humongous businesses, so if they can improve something by even 0.01%, that's a huge chunk of money. Whereas for a float center, moving the needle that little isn't as big a difference for us as it is for a company that size.
Also, they're entirely web-based. That is their focus, and the nuance of how their website directs people plays a much bigger role in their businesses than it does for us as a brick and mortar business.
The third thing that I like to keep in mind is just that the more people you have, the
more numbers going into this, the more significant all the results coming out are.
So it's kind of hard for float centers sometimes. When you run an A/B test where you're comparing 60 people on one side versus 62 on the other, when the numbers are that small in terms of how many people are actually interacting with something, the margin of error is just so much more significant. The fact that two more people did something, and that that counts as a decent chunk of percentage, is a little more wishy-washy than when 1.5 million people did one thing and 1.7 million people did another. There's a lot less influence of random chance in there.
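To put rough numbers on that intuition, here's a sketch of a standard two-proportion z-test; the visitor and conversion counts below are invented, and a |z| above about 1.96 is conventionally treated as significant at the 95% level:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 60 vs 62 conversions out of 400 visitors each: z is about 0.2,
# nowhere near 1.96, so the difference is indistinguishable from chance.
print(two_proportion_z(60, 400, 62, 400))

# The same relative difference at web-giant scale is decisive (huge z).
print(two_proportion_z(1_500_000, 10_000_000, 1_550_000, 10_000_000))
```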
I guess in my mind, too: I totally think this stuff is worth it, and it's good to do this A/B testing and get it going. But at a certain point, if you get deep enough, there are probably other levers in your float center that are going to be more significant to pull, a bigger influence on how you spend your time, than getting insanely deep into the world of A/B testing for your brick and mortar float center's website.
>>Graham: Yeah, and just to balance that out too.
The nice thing is, if you are getting results with just 60 people visiting, those are the big-effect results that you want, which is really good. It might be that when you start out, if you did follow best practices for website design and stuff like that, you don't see many huge gains, and maybe this is just something you run lightly in the background. Honestly, there are a lot of websites out there where I would not be surprised if they started A/B testing and, over the next three or four months, really started seeing some nice traction coming from it.
You can just keep them running as long as you want.
If you start an A/B test and your website traffic isn't very huge, you can just leave
it running in the background for four or five months.
You don't need to immediately follow up on it, and if you do get results within a week, that just means you can act on them that much sooner.
There's nothing wrong with getting things going, just leaving them there, revisiting
them every few months.
>>Ashkahn: Just simmer.
>>Graham: Let them simmer.
Try them on the back burner.
Cool, anyway that's A/B testing in a nutshell.
If you have your own questions, definitely cruise on down to floattanksolutions.com/podcast.
>>Ashkahn: That's right and we'll be here tomorrow again.
So stay tuned.
>>Graham: Bye everyone.