How We Use Machine Learning and Big Data to Improve the Mobile User Experience


Given that 5,000 users downloaded our latest white paper “The UX Mobile in a Lean Agile context”, we would like to present the User Experience (UX) analysis methodology we have built around Machine Learning and Big Data.

1. A growing need for UX analysis

Competition on the app stores has reversed the relationship between apps and users. Back in 2010, Apple’s message was: “If you are looking for that, there is an app for that”. Nowadays, if you are looking for something, there are dozens of apps for it. The mobile user has become more volatile, and the pressure is borne by the product owner, who assumes the responsibility of designing an inclusive product.

Beyond creating useful content, the product owner therefore faces multiple user experience challenges:

  • The product must take into consideration the specificities of each mobile device.
  • It must adapt to two different types of users: Android and Apple.
  • Users’ habits result in increasingly complex technical specifications (online payment, multimedia, facial recognition, voice synthesis, security). This complexity calls for extreme simplification of the product.
  • The user population is increasingly heterogeneous, in terms of both age and technology mastery.

Since the analysis tools currently on the market are mainly focused on CRM tracking, we started, nearly a year ago, designing a user experience analysis tool that we embed in the apps we develop, to support the product owner in the continuous improvement of his/her product.

Example of a BtoC app: unexpected user profiles

When designing an airport app, the product owner had specified that it should target frequent travelers who would use it upon arrival on site, and business men and women in a rush. The interface therefore had to be designed to display information quickly and to highlight features such as automatic Wi-Fi connection or indoor geolocation of the user’s parking space.

Hence, we built the app according to these specifications and embedded Azot, our UX analysis tool. We then observed that users’ behaviors were not limited to what was intended. It appeared that travelers often use the app before their flight to check traffic and estimate their time of arrival. Another type of use also emerged: private hire (VTC) and taxi drivers use the app to know the landing times of the fullest planes.

Thus, for the same app, different user profiles (and especially behaviors) appear, and they are often distinct from those originally identified. This is how the idea of segmenting the usage typologies of a mobile app emerged.

Example of a senior app: the limits of traditional analysis

AppStud has also designed a mobile app for seniors. To do this as well as possible, we worked with a UX agency that invited “target” users to its labs.

We designed several digital interfaces and handed them to these future users, asking them to “use the app as you would at home”. These people knew they were being filmed, and despite our instructions, they read every interface in full and took their time before choosing which button to press.

We then developed a small device that recorded the navigation videos and repeated the experiment, this time by handing the app to untrained senior users. The navigation patterns were very different and the first interactions took place more quickly. There were many more gestures related to navigation misunderstandings than in the lab-testing phase.

Conclusion: UX analysis must be run in a “free” context, favorable to the user, to remove any pressure related to being observed by a third party involved in the project.

Hence the interest in observing (anonymously) the behavior of each of the app’s users.

It became obvious that UX analysis has to be done:

  • By successive iterations
  • By segmenting user behaviors
  • In a context where the user is comfortable and does not feel observed

Determining the relevant information to be analyzed

Therefore, we set out to create a self-contained SaaS tool that would provide the following information:

  • Automatic feedback, to ask the user the right questions while he/she navigates.
  • Gestures, to know the gestural grammar used on each screen.
  • Accepted permissions, to ensure maximum use of the app.
  • Session recordings, to automatically look “over the shoulder” of users (with confidential fields hidden, of course).
  • Network, to identify app behavior depending on network coverage.
  • Phone details (model, screen size, memory, …)
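To make this more concrete, here is a minimal sketch of what a collected session payload could look like. The field names and types below are hypothetical, meant only to illustrate the kind of data involved, not Azot’s actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GestureEvent:
    """One interaction recorded on a screen (hypothetical structure)."""
    screen: str        # e.g. "HOME", "FLIGHT_DETAIL"
    kind: str          # "tap", "double_tap", "long_press", "swipe", ...
    x: float           # coordinates normalized to the screen size (0..1)
    y: float
    timestamp_ms: int

@dataclass
class Session:
    """One anonymous user session, as it could be sent by the SDK."""
    session_id: str
    pages: List[str]                                       # sequence of screens visited
    gestures: List[GestureEvent] = field(default_factory=list)
    permissions: List[str] = field(default_factory=list)   # accepted permissions
    network: Optional[str] = None                          # "wifi", "4g", "offline", ...
    device_model: Optional[str] = None
    screen_size: Optional[str] = None

# Example of a single session record
session = Session(
    session_id="abc-123",
    pages=["HOME", "FLIGHTS_DEPARTURE", "FLIGHT_DETAIL"],
    gestures=[GestureEvent("HOME", "tap", 0.5, 0.8, 1200)],
    permissions=["location"],
    network="wifi",
    device_model="Pixel 3",
)
```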

So the objective was to answer the following question:

Do users understand my interface?

Gesture analysis

A lot of work has been done over the past two decades on click analysis for e-commerce websites. An article entitled Advances in Clickstream Data Analysis in Marketing, published in March 2008, explains the concept of the “clickstream”, which refers to the analysis of clicks and mouse movements made by users on a website.

These data help analysts understand the behaviors and choices made by individuals.

In our case, the main difference between a website and a mobile app is the establishment of a new gestural grammar distinct from the web’s (double tap, long press, swipe, etc.), and content presentation strongly constrained by the small size of the display.

Azot aims at shifting this kind of analysis from the classic web towards mobile. The first research papers dealing with the web go back about twenty years, whereas the democratization of the mobile platform dates from just over five years ago, so the scientific literature on the subject is still much thinner. Given the considerable technological differences between the two platforms, there is an urgent need to complete the existing research: clicks are replaced by taps and swipes, new interactions have appeared (the double tap, the force touch and the long press), and a mobile interface no longer allows mouse-movement analysis.

Heat maps

More recent research on the interaction between users and digital interfaces dates back to 2013. In Gesture Tracking and Recognition in Touchscreens Usability Testing, the author presents the value of gesture analysis in usability studies.

The term “usability” has various definitions. The PN-EN ISO 9241 standard defines it as “a product capability to enable the user to accomplish his/her initial purpose effectively and with satisfaction”.

The objective of gesture analysis is to build usability tests. The authors designed a tool that allowed them to generate heat maps based on the locations and number of human-interface interactions, and thus to identify the most interesting areas of an interface.

This functionality has been implemented in Azot and allows us to identify interaction zones (and therefore interest zones) and to associate them with specific gestures, in order to validate that the gestural grammar adopted in each zone is understood.
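As a rough sketch of the idea (not Azot’s internal code), touch coordinates collected on a given screen can be aggregated into a 2D histogram and rendered as a heat map. The coordinates here are assumed to be normalized to the screen size, and the sample data is made up.

```python
import numpy as np
import matplotlib.pyplot as plt

def touch_heatmap(points, bins=40):
    """Aggregate (x, y) touch positions, normalized to [0, 1], into a heat map grid."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    heat, _, _ = np.histogram2d(xs, ys, bins=bins, range=[[0, 1], [0, 1]])
    return heat.T  # transpose so rows correspond to the vertical axis

# Example with synthetic touches concentrated around a hypothetical button area
rng = np.random.default_rng(0)
touches = rng.normal(loc=(0.5, 0.8), scale=0.05, size=(500, 2)).clip(0, 1)
plt.imshow(touch_heatmap(touches), origin="lower", extent=[0, 1, 0, 1], cmap="hot")
plt.title("Touch density on one screen")
plt.show()
```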

2. Data mining and profiling

The value of data mining in improving mobile UX

Studies have shown that an interface that is understood by the user has a 20% higher chance of leading the user to achieve the goal set by the product owner.

The main issue lies in the fact that, traditionally, users’ profiles are determined by handing them an initial form containing questions. Yet a form on a mobile interface is anti-user-experience, due to the very nature of the device (small screen), of the form (a significant number of questions to determine a specific profile) and of mobile users themselves (in a hurry).

The research paper Mining and Analysis of Clickstream Patterns (2009) emphasizes the value of segmenting users into groups with an intelligent algorithm, according to their usage behaviors on web pages, by retrieving the server logs. In a mobile context, this method would improve the user experience through techniques such as morphing (modifying the interface on the fly).

In our case, the main challenge was to identify user profiles without explicitly questioning them.

Implementing data mining on Azot data

Using statistical methods to analyze these data makes it possible to break free from the preconceived ideas that naturally form during an app’s design and implementation. The different actors involved in the app’s development inevitably look at it subjectively: they designed it to meet a number of well-defined needs, but the user’s behavior includes a great deal of unpredictability, which can lead to a misinterpretation of the data reported by Azot. The information highlighted by statistical tools, on the contrary, is by its very nature independent of any preconception.

However, simple descriptive statistics may not always be sufficient to break these stereotypes. It is therefore necessary to opt for more advanced methods.

Approach taken

Let us note that our goal is to assess the quality of the user experience. This analysis is usually based on users’ feedback, or on studies where users are invited to test the app while their reactions are observed. However, these studies are costly and often biased, because (1) not all user profiles are called upon to testify, and (2) the testimonials are given after navigation. Hence, observing (anonymously) each app user’s behavior separately is highly recommended.

With this objective in mind, an interesting statistical method is clustering. It consists of grouping elements into different classes according to their similarity with each other. In our case, we are interested in how users react to our app, so we have to group them according to the way they use it. To that end, we create groups based on the sequences of pages visited during app use (a session).

The advantage of clustering lies in the fact that it is an “unsupervised” method, i.e. no human intervention is needed (unlike other methods that require pre-labeled data upstream). Thus, it is possible to automatically get rid of all stereotypes and identify the different users’ behaviors.
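A minimal sketch of this idea, assuming each session is reduced to its sequence of visited pages: sessions are turned into page-frequency vectors and grouped with k-means. This is only one possible way to cluster navigation patterns, with made-up page names, and not necessarily the exact algorithm used in Azot.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Each session is the ordered list of screens visited (hypothetical data)
sessions = [
    ["HOME", "FLIGHTS_DEPARTURE", "FLIGHT_DETAIL"],
    ["HOME", "WIFI", "FLIGHTS_DEPARTURE", "FLIGHT_DETAIL"],
    ["HOME"],
    ["HOME", "TRAFICROUTIER"],
    ["TUTORIAL", "LOGIN", "HOME", "FLIGHTS_DEPARTURE", "PARKING"],
]

# Represent every session as a weighted bag of pages, then cluster them
docs = [" ".join(pages) for pages in sessions]
vectors = TfidfVectorizer(token_pattern=r"[^ ]+").fit_transform(docs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for pages, label in zip(sessions, labels):
    print(label, " > ".join(pages))
```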

However, the sessions sent by Azot contain information that goes beyond the pages visited. It is therefore also interesting to put this information into perspective with the different “navigation patterns” obtained previously. Another statistical method (Multiple Correspondence Analysis) and statistical hypothesis tests make it possible to extract the distinctive characteristics of each user group. This can help the app designer see, for example, whether his/her app is easy to use depending on the phone model, etc.
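As a hedged illustration of the hypothesis-testing step (the counts below are invented), a chi-square test of independence between the cluster label and a categorical session attribute, such as the phone model, indicates whether that attribute is distributed differently across the behavior groups.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical contingency table: cluster label vs. phone model
table = pd.DataFrame(
    {"iPhone": [120, 40, 10], "Galaxy": [80, 70, 5], "Pixel": [30, 25, 15]},
    index=["class 0", "class 1", "class 2"],
)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p-value={p_value:.4f}")
# A small p-value suggests that the phone model is not distributed
# the same way across the navigation clusters.
```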

Results regarding the airport app

Based on one week of sessions reported by Azot for this app, we can draw the different sequences of pages visited and classify them according to their similarity. Once we obtain these session groups, we can observe the characteristics of each group to get this type of information:

We end up with a wealth of information that enables us to better understand users’ behaviors. We can thus identify three different ways of using the app (there are actually more classes, but the following are enough to illustrate the example):

  • Class 0, the largest (about 60% of sessions), represents users who use the app to track their flight details. This is the behavior envisioned during the app design, with some modifications (for example, the HOME and WIFI pages are not always part of the flow).
  • Class 1 (nearly 30% of sessions) refers to people who use the HOME page almost exclusively, for example to check waiting times at security checks. This class also includes people who occasionally use another page, such as TRAFICROUTIER. It highlights a very specific use of the app: in almost one third of the cases, the user visits only one page, seeking a specific piece of information and overlooking the rest.
  • Finally, class 2 (about 7% of sessions) is typical of first connections to the app: browsing tutorial pages, creating an account (LOGIN page) and viewing a few pages. These sessions are generally lengthy and many pages are visited. This highlights a specific behavior of users exploring the app at first login, which could indicate, for example, that the tutorial is not enough to let users fully discover the app.

Navigation adjustment

The ultimate goal would be to use this information to adjust navigation for different users. Indeed, for a given user, once his/her typical navigation pattern and the pages he/she tends to use are identified, it would be possible to customize the homepage so as to give him/her direct access to the services he/she is used to, which naturally leads to a better user experience.

In the previous case, for the navigation pattern Home > Flights_Departure > FlightDetail > Flights_Departure, we could force the app to “boot” on the Flights_Departure page, skipping the Home page and thus reducing the number of interactions between the user and the interface.
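A minimal sketch of how such an adjustment could be wired, with hypothetical page names and logic rather than the production code: once a user’s dominant pattern is known, the app skips the generic home screen and starts on the first specific page of that pattern.

```python
DEFAULT_START = "Home"

def start_page(dominant_pattern):
    """Return the screen the app should open on, given the user's usual pattern."""
    if not dominant_pattern:
        return DEFAULT_START
    # Skip the generic Home screen if the user always goes somewhere specific next
    for page in dominant_pattern:
        if page != "Home":
            return page
    return DEFAULT_START

# Example: the pattern observed for this user in the airport app
print(start_page(["Home", "Flights_Departure", "FlightDetail", "Flights_Departure"]))
# -> Flights_Departure
```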

In our case, we skip a single screen, but in an e-commerce app with a significant number of screens, this mode of operation could easily be extended to the product categories most often consulted, in order to automatically draw users towards a particular category.

Limits and future research lines

Preliminary results indicate that the approach works well in practice. Azot is being integrated into several e-commerce apps. At this stage, we are confronted with two major issues:

  • How to know whether it is always the same user who is using the device to view the app? For example, a family that shares a tablet.
  • How to recognize the same user across different devices? For example, the same user who uses the same app from different smartphones or tablets. This problem is more easily circumvented than the previous one, given the possibility of tying a behavior to a user account through a login. It is more difficult when the app requires no login.

Research lines

As an initial step, we wanted to validate the approach on a particular metric: the “navigation patterns”.

A first research line, currently under study, consists of addressing the other metrics that we retrieve (gestures and heat zones) in order to expand the behavioral segmentation.

Want to know more? Contact us!

You have an idea in mind? AppStud helps you develop it!

Product thinking • Design • Development • Acquisition • Product evolution

