
Implement mixpanel a/b testing snippet #1733

Closed
zavreb opened this issue Jan 8, 2018 · 18 comments


@zavreb

zavreb commented Jan 8, 2018

see here: https://mixpanel.com/help/reference/ios#tweaks

#import "Mixpanel/MPTweakInline.h"

A/B testing experiments
Prerequisites
Getting started with A/B testing is quick and easy. Just include the latest version of the Mixpanel iOS library in your app and you are ready to start experimenting.

Make sure that you have already:

Included the latest version of the Mixpanel iOS library in your app (v2.5+)
Created an experiment on the A/B Testing tab of the Mixpanel website
To use the UI's visual experiment creator, please ensure that you're in the project appropriate to your app's current build (i.e., Production or Development). While not required, it's a good idea to connect your mobile device to WiFi while using the A/B designer.

Once you have created an experiment and, optionally, decided which users you wish to target, simply turn on the experiment to start serving your A/B test to customers. It is that simple!

Planning to run an experiment on the initial view of your app? It can take several seconds for experiments to be applied on first app open; as a result, we recommend against putting UX changes or developer Tweaks on the first view of your app. If you wish to A/B test the initial app view, you will need to take delivery latency into account. We recommend using the checkForVariantsOnActive option (to fetch variant data when the app is opened) together with the joinExperimentsWithCallback method (to apply the variant data to the view once it arrives).

Notes on experiment delivery
Mixpanel checks for any new experiments asynchronously on applicationDidBecomeActive. After the response is received, experiment changes and Tweaks are applied or removed where appropriate. To handle network availability, each experiment is cached on the device so it can be re-applied when the API call cannot be successfully made.

If you'd like more control over when this check for new experiments occurs, you can use the checkForVariantsOnActive flag and the joinExperiments or joinExperimentsWithCallback methods to download and apply experiments manually.
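The manual-control path described above might look like the following Objective-C sketch. This is a hypothetical illustration, not code from this thread: the project token is a placeholder, and the surrounding app-delegate/view-controller context is assumed.

```objc
#import "Mixpanel/Mixpanel.h"

// At launch: take manual control of experiment fetching
// instead of checking automatically on applicationDidBecomeActive.
Mixpanel *mixpanel = [Mixpanel sharedInstanceWithToken:@"YOUR_PROJECT_TOKEN"];
mixpanel.checkForVariantsOnActive = NO;

// Later, e.g. in the initial view controller: download and apply
// any running experiments, then render the (possibly tweaked) UI.
[[Mixpanel sharedInstance] joinExperimentsWithCallback:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        // Variants have been applied; safe to show the first view now.
    });
}];
```

Deferring the first render until the callback fires is what works around the delivery latency mentioned above, at the cost of a short delay on first open.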

The $experiment_started event is fired when a given experiment (changes and/or Tweaks) is first started on a device. The event contains an $experiment_id property with the experiment's id, which we encourage you to use in funnels and our other reports.

A/B developer tweaks
For more complex changes that you want to A/B test, you can include small bits of code in your apps called Tweaks. Tweaks allow you to control variables in your app from your Mixpanel dashboard. For example, you can alter the difficulty of a game, choose different paths through the app, or change text. The possibilities are endless.
Objective-C
To use Tweaks in any Objective-C file in your app, you need to import one file.

#import "Mixpanel/MPTweakInline.h"
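A Tweak is then read inline wherever the value is used. The snippet below is a hypothetical example; the Tweak name and default value are invented for illustration:

```objc
#import "Mixpanel/MPTweakInline.h"

// MPTweakValue(name, default) returns the default value until the
// Mixpanel dashboard overrides it as part of an experiment.
self.welcomeLabel.text = MPTweakValue(@"Welcome text", @"Hello, world!");
```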

@zavreb
Author

zavreb commented Jan 18, 2018

May require some research if the snippet doesn't work.

@southerneer
Contributor

@thescurry @zavreb implementing Mixpanel as suggested in those linked docs probably doesn't make sense for our project. Basically it assumes a 100% native (Objective C or Swift) app. If A/B testing UI components is a high priority then we need to explore React Native/JS options instead. Happy to discuss further if you need more clarification.

@thescurry

@southerneer @zavreb This might be a possible solution: https://github.com/davodesign84/react-native-mixpanel

@southerneer
Contributor

southerneer commented Feb 13, 2018

Yep, we already use rn-mixpanel. It's a pretty lightweight wrapper and doesn't include a/b testing integration afaict.

@southerneer
Contributor

If you guys ever want to see the list of packages we already use in our project you can check out package.json and look for the "dependencies" section.

@southerneer
Contributor

Research done, closing

@zavreb
Author

zavreb commented Feb 15, 2018

@southerneer seeing as this ticket was closed, are there subsequent tickets describing potential proposals?

Otherwise, if we just close this ticket without doing so, we could "lose" track of this product. It'd be best if this followed our normal pipeline flow instead of just closing the ticket.

@thescurry

'bout deez apples? http://blog.nparashuram.com/2017/09/ab-testing-for-react-native-apps-with.html

@thescurry

Not really for mixpanel, but another possible option: https://github.com/taplytics/taplytics-react-native

@zavreb zavreb reopened this Feb 15, 2018
@zavreb
Author

zavreb commented Feb 15, 2018

Need to determine proposals before closing this ticket.

@southerneer
Contributor

Sorry for the early ticket close.

Given the triflingly small star count (6) on the taplytics repo I'm going to vote for the codepush path.

For simple scenarios (change anything UI/React-related and push to a % of users) we can use the existing codepush infrastructure. We can track the changes with mixpanel. The hard part is probably figuring out the best way of organizing everything to extract meaningful data.

More complicated scenarios (like targeted audience tests) would probably require a lot more work. I can't find any halfway decent react-native repos for doing this sort of thing which means that we'd have to develop a custom RN bridge to a native library like the Mixpanel one mentioned in becky's original post.

@zavreb
Author

zavreb commented Feb 16, 2018

Yay looks like my really long comment didn't save. Alright try #2.

@thescurry to determine the route. I'd say the only product requirements are:

  1. Anything that can get us to a/b testing as soon as possible
  2. a process that can grow with us long term
  3. should let us see the performance of the a/b test (or else there is obviously no point 😞)
  4. has a fairly simple and streamlined deployment process for a/b testing

nice-to-have:

  1. product can a/b test without help from the dev team (meaning we can run our own tests without creating tickets on GitHub or waiting for a build or a codepush deploy)

@southerneer
Contributor

One very important thing to note here: we need a lot more users than we currently have for A/B testing to yield statistically significant results.

https://conversionxl.com/blog/ab-testing-statistics/

In other words, we probably need several thousand users before anything beyond the simplest test will show us anything worth acting on.

@thescurry

@southerneer working on solving that issue! :)

@zavreb
Author

zavreb commented Feb 20, 2018

@thescurry so should we punt on this until we have more users...

I mean technically we only need 5%+ for it to be statistically significant. Lol but I would argue we may not even have enough DAUs or MAUs to really test stuff yet. I vote to Punt this until after our geofence build goes out...

@thescurry

@zavreb if you vote punt, I vote punt.

punts

@zavreb zavreb added the PUNT! label Feb 21, 2018
@bengtan
Contributor

bengtan commented Feb 22, 2018

Yes, punt.

A/B is for evolutionary improvements. We're still doing revolutionary stuff.

@bengtan
Contributor

bengtan commented Jun 18, 2018

Purging due to redesign/board-restructure/out-of-date/etc.

@bengtan bengtan closed this as completed Jun 18, 2018