We live in a world dominated by mobile apps and connected devices. State-of-the-art mobile phones and tablets use apps to organize knowledge and information, control devices, and complete transactions via local, web, and cloud services. However, users are challenged to select a suite of apps, from the millions available today, that is right for them. Apps are increasingly differentiated only by the user experience and a few specialized functions; as a result, many apps are needed to cover all of the services a specific user requires, and users must frequently switch between apps to achieve a specific goal. The user experience is further limited by the inability of apps to effectively interoperate, since relevant user data are often wholly contained within the app. This limitation significantly undermines the continuous (function) flow across apps toward a desired goal. The result is a disjointed user experience that requires app switching and data replication among apps. With these limitations in mind, the current mobile experience appears to be nearing the limits of its potential while failing to leverage the full power of modern mobile devices. In this paper, we present a vision of the future in which apps are no longer the dominant mode of customer interaction in the mobile world. The alternative that we propose would “orchestrate” the mobile experience using a “moment-first” model that leverages machine learning and data mining to bridge a user's needs across app boundaries, matching context and knowledge of the user with ideal services and interaction models between the user and device. In this way, apps would be employed at the function level, while the overall user experience would be optimized by liberating user data from the app container and intelligently orchestrating interactions to fulfill the needs of the moment.
We introduce the concept of a functional entry-point and apply the simple label “FUNN” to it (named “FUNC” in (Wang, 2014)). We further discuss how a number of learning models could be used to build the relationship between the user, FUNN, and context, enabling search, recommendation, and presentation of FUNNs through a multi-modal human–machine interface that better fulfills users' needs. Two examples demonstrate how this vision is being implemented in home entertainment and driving scenarios. In conclusion, we envision moving forward into a FUNN-based mobile world with a much more intelligent user experience model. This in turn would offer the opportunity for new relationships and business models between software developers, OS providers, and device manufacturers.
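To make the idea of matching context to FUNNs concrete, the following is a minimal, purely illustrative sketch (not the paper's actual models): each hypothetical FUNN is tagged with a feature vector over invented context dimensions, and candidates are ranked by cosine similarity to the current context vector. All FUNN names, dimensions, and values here are assumptions for illustration only.

```python
import math

# Hypothetical FUNNs (functional entry-points), each tagged with a feature
# vector over three invented context dimensions: [leisure, in-car, at-home].
# These names and numbers are illustrative assumptions, not real data.
FUNNS = {
    "play_movie":  [0.9, 0.1, 0.8],
    "navigate":    [0.2, 0.9, 0.1],
    "pay_parking": [0.3, 0.8, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_funns(context):
    """Return FUNN names ordered by similarity to the current context."""
    return sorted(FUNNS, key=lambda n: cosine(FUNNS[n], context), reverse=True)

# Driving scenario: low leisure, strongly in-car, mostly away from home.
print(rank_funns([0.2, 0.9, 0.2]))
```

In practice, the learning models discussed in this paper would replace these hand-set vectors with representations learned from user behavior, but the ranking pattern, scoring every FUNN against the moment's context, is the same.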