Friday, May 25, 2018

Google I/O '18 Developer News



Disclaimer: The following content is based on personal recollections and interpretations and may therefore be incomplete or faulty. Attendance was funded by me as a private individual so views and opinions do not reflect those of my employer. Content licensed under Creative Commons (CC BY) Casper Bang.


A few days ago I wrote about the Google I/O 2018 keynote, which was not particularly technical and primarily addressed the general tech public (managers, journalists and the like). Most of I/O of course revolves around much deeper content, split between technical talks, office hours, sandboxes, expositions etc., and below I will try to summarize it - if you are NOT a developer, this may not be for you.

Android JetPack

This was a brand new term presented to us at the Developer Keynote on day one; there weren't even hints of it in the Google I/O app or on the schedule. While initially somewhat confusing, after having spent 3 days in sessions and talking to Google employees, it is now clearer to me. Android JetPack is an overall umbrella covering libraries, tools and architectural guidance to help build best-practice Android apps. It addresses the following aspects.

Support-library rebranding and refactoring

The Support library is an essential part of most Android projects, but its organization and versioning have become somewhat complex. Google is now splitting the support library into smaller discrete parts under the new package name androidx.* rather than android.*, to make it clear what's bundled with the Android OS and what's bundled with the app. Apart from the change in package name and the finer-grained structure, the library versions have been reset from 28.0.0 to 1.0.0. This is a welcome cleanup, and the tooling in the latest Android Studio 3.2 Canary build 14 already seems to offer great support for this migration.
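As a sketch of what the migration toggle looks like, the refactoring tooling is driven by two flags in gradle.properties (flag names from the Android Studio 3.2 canary tooling; verify against the current docs):

```properties
# gradle.properties
# Use the new androidx.* artifacts instead of android.support.*
android.useAndroidX=true
# Rewrite third-party binary dependencies to androidx.* as well ("jetifier")
android.enableJetifier=true
```

Dependencies then move to the new coordinates, e.g. com.android.support:appcompat-v7 becomes androidx.appcompat:appcompat.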

Android Architecture Components

Google never had strong opinions on the actual architecture and structure of an application (you probably know the rather famous response by Dianne Hackborn on the subject), which is great for freedom but not so great for consistency across apps or for figuring out best practice. At last year's Google I/O, the Android Architecture Components launched to address some of these issues, including reactive programming between Views and ViewModels for a somewhat MVVM-like architecture. This year Google takes this a step further, under the JetPack umbrella.
Navigation and backstack management
One of the major issues on Android remains flow and navigation within the app, since you are forced to think hard about the wiring and transitions between Activities and Fragments - done exclusively in code and without any overview of or insight into these navigation flows afterwards. A closely associated nuisance is dealing with the backstack, which is also often quite a headache - especially if you have UX designers focusing on iOS, who tend to forget that on Android, by default, you can always go back to a previous screen (be it a Fragment or an Activity).


With JetPack, Google is now bringing navigation front and center, by removing the need to write manual transitions in code altogether. This is achieved by adding support for a Navigation Component in the libraries and in Android Studio, as well as by advocating a single-Activity architecture.
This latter point is a clear departure from earlier recommendations: multiple Activities are now recommended only if you have very specific deep-linking needs and point-of-entry control.
You can read up on the new navigation component here. It's in alpha, so while it's early days, adopting it for new apps seems a good idea - you are not likely to be able to retrofit an existing app particularly easily. As a practicing Android developer, this is to me one of the most noteworthy announcements at this year's Google I/O, simply because it's likely to have a profound impact on how apps are developed and maintained in the future.
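To give a feel for it, a navigation graph is declared as an XML resource; the fragment class names and ids below are hypothetical, and the alpha schema may still change:

```xml
<!-- res/navigation/nav_graph.xml -->
<navigation xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    app:startDestination="@id/homeFragment">

    <fragment
        android:id="@+id/homeFragment"
        android:name="com.example.HomeFragment">
        <!-- A declared action replaces a hand-written FragmentTransaction -->
        <action
            android:id="@+id/action_home_to_detail"
            app:destination="@id/detailFragment" />
    </fragment>

    <fragment
        android:id="@+id/detailFragment"
        android:name="com.example.DetailFragment" />
</navigation>
```

Navigating is then a single call such as Navigation.findNavController(view).navigate(R.id.action_home_to_detail), with the backstack managed for you.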
WorkManager
We've had AlarmManager for scheduling work on Android since day one. It was superseded in Lollipop by JobScheduler, which added a much more powerful abstraction catering to more aspects than just time. To confuse matters further, we also got to mess around with SyncAdapters, Firebase JobDispatcher etc.


Now we're getting a replacement for it all in the form of WorkManager, which does not require Google Play Services to be installed on the device and which offers an even richer API than JobScheduler, based on constraints you specify for when and how you want the work done. WorkManager also integrates with other Architecture Components constructs, notably LiveData. JobScheduler is very robust and not likely to go anywhere soon, but it's hard to see a reason not to use WorkManager for all future Android work. Like much of JetPack, WorkManager is an alpha release.
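A minimal sketch of the constraint-based API; signatures have shifted between WorkManager releases (it is in alpha), so treat this as illustrative, and UploadWorker is a hypothetical worker of my own:

```kotlin
// Hypothetical worker performing some deferrable background upload.
class UploadWorker(ctx: Context, params: WorkerParameters) : Worker(ctx, params) {
    override fun doWork(): Result {
        // ... perform the actual upload here ...
        return Result.success()
    }
}

// Only run when on an unmetered network and while charging.
val constraints = Constraints.Builder()
    .setRequiredNetworkType(NetworkType.UNMETERED)
    .setRequiresCharging(true)
    .build()

val request = OneTimeWorkRequest.Builder(UploadWorker::class.java)
    .setConstraints(constraints)
    .build()

WorkManager.getInstance(context).enqueue(request)
```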
Paging
Whenever you need to load and show lists of just about anything on Android, you're best off using the RecyclerView - a versatile and efficient component introduced with Lollipop. With the increased popularity of reactive programming (internal LiveData and external RxJava) and the introduction of the Room ORM in Android Architecture Components last year, we're now getting a Paging library to address the paging aspect, both in terms of performance and (network) error handling. This is a welcome addition to Architecture Components, because developers were often stuck figuring out the glue between the view, the local database and the remote resource - and the optimal synchronization between them. Interestingly enough, the Paging library is not an alpha release but is already marked as stable.
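The typical wiring has Room hand back a DataSource.Factory that the Paging library turns into an observable PagedList; the entity and DAO names here are hypothetical:

```kotlin
@Dao
interface UserDao {
    // Room generates a paged data source instead of loading the full table.
    @Query("SELECT * FROM user ORDER BY name")
    fun usersByName(): DataSource.Factory<Int, User>
}

// Expose pages of 50 rows as LiveData a RecyclerView adapter can observe.
val pagedUsers: LiveData<PagedList<User>> =
    LivePagedListBuilder(userDao.usersByName(), /* pageSize = */ 50).build()
```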

App Bundles & Dynamic Delivery

Over the years apps have increased in size and complexity. While not quite as bad on Android as on iOS, larger download sizes mean more users sticking to older versions, a sub-optimal user experience as well as increased data usage.

Google has long used the so-called "split" mechanism to split APKs along dimensions such as density (DPI) and architecture (ABI), either automatically or as instructed by you. The best example of this is probably the fact that there are a whopping 31 variants of Google Play Services version 12.6.85 - should you choose to go after the raw APK rather than having your device or Android Studio handle this for you. The main reason for this is of course to ensure that users only get what they actually need - there's no reason to include very high density graphics for a low-end phone unable to render them anyway.

Now Google is taking this to the next level by introducing an improvement to the split mechanism, known as App Bundles, which makes use of Dynamic Feature Modules. What this means is that you will be able to modularize your app into key features, catering both to casual users who just need basic functionality and to hardcore users taking advantage of every corner of your app. Facebook and its Messenger come to mind here, but really this mechanism seems applicable to solid design principles for just about any app, and as a fan of microarchitecture/plugin designs I find this pretty awesome.




The Dynamic Feature mechanism works on Android L (Android 5.0/API level 21) and later - for older versions, you decide whether to bundle the feature in or not. While not quite ready yet, Google says it's their intention to make Instant Apps and Dynamic Feature Modules work hand in hand - so in the future perhaps we won't install an app at all, but simply follow a link with basic bootstrapping functionality delivered as an Instant App, where the rest is added on-demand using Dynamic Feature Modules!
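In Gradle terms the base module declares its dynamic features, and each feature applies a dedicated plugin; the plugin id is as introduced with Android Gradle Plugin 3.2, while the module names are hypothetical:

```groovy
// app/build.gradle (the base module)
apply plugin: 'com.android.application'
android {
    // Features that can be downloaded on demand rather than at install time
    dynamicFeatures = [':feature_editor']
}

// feature_editor/build.gradle (an on-demand feature module)
apply plugin: 'com.android.dynamic-feature'
dependencies {
    implementation project(':app')
}
```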

Slices

At first sight, this smells a bit like Instant Apps and App Actions, but it addresses an entirely new problem. This is an interesting response to the problem of people installing your app but never using it again, because they forget it's there or don't know about its features.


Imagine if you could provide small parts of your app to people making searches using the Google Launcher, a Google search or the Google Assistant, without them having to launch your app directly at all. That's the very interesting promise of Slices! A "slice" is a way to surface your app's UI inside another app - or, as one Google engineer explained it, Google's new approach to remote content. To understand this, a few examples probably help.
The first example, "Navigation", shows how Lyft (a service like Uber but with better ethics) is able to promote itself as the provider of a lift home or to work, with a car arriving in 4 minutes and the price shown - all just from having typed "get a ride" as a search term! The second example, "Task Completion", shows how the Android OS itself (starting with P) is indexed in order to integrate with Google search. The last example, "Recall & Discovery", shows how content within your apps can be indexed and found.

I didn't really see this coming, but it's interesting on at least one front: it's going to provide a uniform interface uniting similar but competing services! After all, it's not hard to imagine that in a few years we won't even launch an app directly anymore - we'll just express our intent, choose a provider (based on some very rich criteria) and only then decide who we should ride with (taxi, bus, train, Waze, Uber, Lyft etc.). Want to listen to a particular song or watch a special movie? Just search for it and let the aggregating providers fight among themselves to bring you a hit; gone will be the days when you need to consult multiple apps manually, one by one.

Slices is in alpha, expected to reach stable by the launch of Android P and to be backported to older versions as well.

Kotlin Extensions

Google remains committed to Kotlin, evident not only from the fact that most talks with slides used Kotlin rather than Java, but also because they keep adding features that make the development experience superior. The Kotlin Extensions, which we already know and love for e.g. the view binding mechanism, are getting a boost from some new extensions, most notably ones supporting the new Navigation features I mentioned earlier. These appear to follow the same fine-grained breakdown model as the Support Library refactoring, where you can pick and choose what you need. These extensions are all in alpha releases.

D8

D8 is the new dexing and desugaring tool (it rewrites syntactic sugar from newer Java language levels into bytecode older devices understand). Prior to Android Studio 3.2 it had to be enabled explicitly by setting a flag in gradle.properties:
android.enableD8=true
Since Android Studio 3.2 it is enabled by default, yielding better build performance.

R8

The minification tool ProGuard has served us well for many years, but now Google is taking matters into their own hands by introducing R8 ("Reducer for Java 8"?). This probably has to do with the new D8 desugaring tool and the fact that, in order to obtain better build speeds and smaller APK/AAR sizes, ProGuard is no longer working at the right abstraction layer. Thankfully, R8 is fully compatible with the ProGuard rule DSL, so there should be no changes from a user's point of view. Google made R8 open source, so there is no real licensing difference either, except for swapping out GPL2 for a BSD-style license. Unlike D8, R8 is not enabled by default in Android Studio 3.2, so you have to turn it on explicitly by setting the following flag in gradle.properties:
android.enableR8=true
Interestingly to me as a Dane, R8 appears to be at least partially developed by Google Denmark, judging by the active committers Mads Ager, Søren Gjesse and Christoffer Adamsen.

ConstraintLayout

If you're a seasoned Java developer who remembers Swing, you will now have seen two rounds of "layout manager done right". In Swing we eventually ended up with GroupLayout and IDE support in the form of Matisse. On Android, we seem to have ended up with ConstraintLayout - a truly one-size-fits-all solution which allows flexibility and avoids deeply nested layouts (which are slow), while still being possible to grasp.

Version 1.1 was just released, but at Google I/O we got to see version 2.0. This version adds more attributes for populating the layout editor at design time with representative data. More interestingly, there is now also support for animation using decorators, backed by a motion design tool within the IDE - which is sure to make custom animation work much simpler and faster.

ConstraintLayout 2.0 is not released yet but is said to be coming soon - it's unclear in which state (alpha, beta or final). What is abundantly clear, however, is that one had better get up to speed on ConstraintLayout as the default go-to layout manager for Android development.

ML Kit for Firebase

So this one is not Android related - not directly anyway, since it applies to the cloud and iOS as well. As someone who has spent time doing Machine Learning with TensorFlow Lite, I can attest to the need to make Machine Learning tools more approachable - and now Google is adding such an abstraction on top of Firebase!
ML Kit for Firebase is a new SDK that packages years of Google's machine learning work so that mobile app developers on both iOS and Android can use it to enhance their apps. ML Kit offers both on-device and Cloud APIs. The on-device APIs process data without a network connection, whereas the Cloud APIs use Google Cloud Platform to process data with more accuracy. Seasoned ML developers can still deploy their own custom TensorFlow Lite models, which you won't even have to bundle with your app (they are served lazily by Firebase), but Machine Learning noobs like myself will probably mostly have a go at the 5 ready-made APIs for text recognition, face detection, barcode scanning, image labelling and landmark recognition. Google said they will soon add a Cloud service for Smart Reply (basically exporting the Smart Reply feature known from Gmail) as well as a very impressive-looking face contour tracker for augmented reality etc.
The on-device APIs are free to use, while the cloud-based APIs follow the common Firebase cloud API pricing.
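As a rough sketch of the on-device text recognition flow - the API names here are as I recall them from the 2018 beta and are likely to change, so check the current Firebase documentation before relying on them:

```kotlin
// Wrap a Bitmap for the vision APIs.
val image = FirebaseVisionImage.fromBitmap(bitmap)

// On-device detector: works offline, slightly less accurate than the Cloud API.
val detector = FirebaseVision.getInstance().visionTextDetector

detector.detectInImage(image)
    .addOnSuccessListener { text ->
        // Each block is a paragraph-like chunk of recognized text.
        text.blocks.forEach { block -> Log.d("MLKit", block.text) }
    }
    .addOnFailureListener { e -> Log.e("MLKit", "recognition failed", e) }
```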

ML Kit for Firebase is currently a beta feature already visible in your Firebase Console and you can read more about it here.

Firebase TestLab for iOS

Staying with the Firebase theme a bit: Test Lab has now gained support for testing iOS apps! This is a huge step for Firebase being taken seriously as an overarching support stack for mobile development, covering more than just Android.
While on Android we continue using UiAutomator and Espresso tests, you need to use XCTest for iOS apps. The free plan provides a daily quota of 5 physical and 10 virtual device test executions. The Blaze plan (which I use) is pay-as-you-go, much like BigQuery.

Material Design 2.0

With the upcoming Android P rumoured to bring another iteration of Google's Material Design language, and Google relaunching their material.io website, I'm going to take the liberty of referring to this as Material Design 2.0.




I interpret Material Design 2.0 as a relaxation of the original Material Design guidelines, which could be seen as quite rigorous and thus not favorable to custom branding/theming. This now seems to be addressed by means of Material Theming. As an example of the relaxation, FAB (Floating Action Button) buttons are now customizable: they no longer need to sit in just one place, and they don't necessarily need to be round - say, if your brand caters more to a diamond shape.
It's a welcome touch; as a native app developer who believes the best apps come from developing "to the OS" rather than merely "on the OS", I often struggle with designers trying to fit their iOS design onto Android. What has been known to happen is that designers will launch YouTube, Google Photos, LinkedIn and other behemoth apps with 100.000.000+ users and use the observed legacy UX idioms of these apps as a baseline - rarely aligned with the latest and greatest of Material Design. Speaking of designers: they are now getting a plugin for Sketch to make their work easier.

While things may improve here for theming and branding, Google still adheres to their Material Design Components as a base of usability. These components are actually manifested in 5 separate libraries, since there is support for Android, iOS, Flutter, Web and React. All libraries are open source and live on a public GitHub site under an Apache 2 license. It's Google's vision that you should not need to build UI using custom components unless you have a really good reason to do so. As a developer favoring component reuse, I'm a big fan of this way of thinking.

The most noteworthy new component to me is the Bottom App Bar, with a centered Floating Action Button sitting a bit lower, cradled into the bar.
It's a nice touch, and unlike the Bottom Navigation bar, the Bottom App Bar should be used for task-based flows rather than app-wide content navigation. The retake on Material Design can already be experienced in the Google I/O 2018 app, Google Pay, Google Tasks and Gmail - and other teams inside Google are said to be updating their designs accordingly soon.
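In layout terms, the cradled FAB is achieved by anchoring it to the bar inside a CoordinatorLayout. The widget packages below are from the new Material Components artifacts (androidx line) and may differ if you are still on the android.support.design.* artifacts:

```xml
<androidx.coordinatorlayout.widget.CoordinatorLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.google.android.material.bottomappbar.BottomAppBar
        android:id="@+id/bottomAppBar"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom" />

    <!-- Anchoring the FAB to the bar cradles it into the bar's cutout -->
    <com.google.android.material.floatingactionbutton.FloatingActionButton
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        app:layout_anchor="@id/bottomAppBar" />

</androidx.coordinatorlayout.widget.CoordinatorLayout>
```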

Wrap Up

All in all, there's much to be excited about here - arguably more than was hinted at in the Keynote. As an Android developer, the navigation support in JetPack's Android Architecture Components stands out. Google has been listening to developers struggling to find a best-practice app architecture and is now finally responding to this need. Hurray!

App Bundles and Dynamic Feature Modules have the potential to shake things up quite a bit in the long run - along with Slices. It's hard to predict the intersection of these technologies, but it's clear that Google is taking a "let's throw stuff at the wall and see what sticks" approach, leaving nothing unexplored. It strikes me as fairly important to keep an eye on these features in the coming year as they mature and see practical use.

ML Kit for Firebase also looks really interesting, even if I had promised myself never again to burn two days on work I neither understood nor could reasonably expect to ever have time for. ML Kit is sure to lower the bar compared to native TensorFlow.

For some reason, none of the talks I went to made use of the microphones placed in the audience, so unlike last year it was more difficult to get to ask the gurus - and the chaotic office hours didn't work out for me this time either. I really think Google needs to address this; if nothing else, add a moderated "Q&A discussion" section to the I/O app so that vague sessions or follow-up questions can be handled better. For my part, I would still love to know why so many of Google's own apps appear to violate their own Material Design guidelines by showing 4 icons with labels in a bottom navigation component - something that requires custom hacking to achieve. Google+, Google Photos, YouTube and Google News are all pretty major examples of this - which makes it hard to convince designers, who will look at Google's apps before reading technical guidelines.

UPDATE: It seems that Google has updated the Bottom Navigation component and the associated design guidelines to now support a more flexible policy. This is possible if you use version "28.0.0-alpha1" of the support library along with "android-P" as compileSdkVersion.

Google is really snappy at getting conference coverage out on YouTube, so you can already watch the 192 technical talks there - if you can convince your boss to spend time on it. Also remember that you can play around with many of the new features highlighted above by going to Google's Codelab pages, which are definitely the path of least resistance for learning this new stuff. I certainly plan to do so as soon as I get rid of my jetlag and have caught up with my day job. :)
Google I/O is truly a festival for geeks - simply unmatched by any other conference!

Friday, March 9, 2018

DI on Android (Kotlin) using an SPI

Most developers are intimately familiar with what an API is. Mention an SPI, however, and many will draw a blank. This blog entry explains what it is, discusses its merits and shortcomings, and shows how it can serve as a simple Inversion of Control or Dependency Injection pattern on Android for providing testing mocks.

So what is an SPI anyway?

The term SPI is an abbreviation of Service Provider Interface, and it's nothing more than a contract (interface) with pluggable implementations (classes) discovered lazily at runtime. I have also seen this pattern referred to as Service Locator and Service Registry. It likely has an older history than I am aware of (you can use it in any language with introspection and reflection available), but I made its acquaintance with Java 1.3, which introduced these service providers as a way of facilitating loosely coupled modules. In Java 6 the pattern was embraced further with the addition of the ServiceLoader class, and whenever you use an XML parser, a JSON parser, special character sets, an encryption cipher, an image decoder etc. on the JVM, you likely went through an SPI to get to it.

How does it work?

I was going to explain how it works in detail, but it occurred to me that the online Android documentation for ServiceLoader does it better than I possibly could! So let me just sum it up; all you do is:
  1. Declare that you implement a given interface by providing a /META-INF/services/INTERFACE_NAME file with the content pointing to the class implementing the interface.
  2. In your app at runtime, you use ServiceLoader to obtain an instance to the implementation (note that there could be more than one).
  3. Use Gradle to configure which implementations end up on the classpath.
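To make the mechanics of steps 1 and 2 concrete, here is a self-contained sketch (the interface and class names are made up). Instead of packaging the registration file in a JAR/AAR, it fabricates the file in a temporary directory, then discovers the implementation via ServiceLoader:

```kotlin
import java.net.URLClassLoader
import java.nio.file.Files
import java.util.ServiceLoader

// The contract (would live in your main source set).
interface Repository {
    fun describe(): String
}

// A fake implementation (would live in a test/fake module).
class FakeRepository : Repository {
    override fun describe() = "fake"
}

fun main() {
    // Normally this file ships inside the module's JAR/AAR under
    // META-INF/services/; here we create it on the fly to stay self-contained.
    val root = Files.createTempDirectory("spi-demo")
    val services = Files.createDirectories(root.resolve("META-INF/services"))
    Files.write(services.resolve("Repository"), listOf("FakeRepository"))

    // ServiceLoader scans the classpath (here: the temp dir) for registrations
    // and lazily instantiates the classes they point to.
    val classpath = URLClassLoader(arrayOf(root.toUri().toURL()))
    val repo = ServiceLoader.load(Repository::class.java, classpath).first()
    println(repo.describe())  // prints "fake"
}
```

Note that the registration file is named after the interface and contains the implementation class name - that is the entire "configuration".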

I actually think of this as Dependency Injection, and it was a thing long before people started writing complex libraries for achieving IoC/DI (Spring, Seam, Dagger, Guice, EJB3, Koin etc.). The beauty of this pattern is that you do not need a dependency injection container, you do not need a mocking library, and there is no configuration beyond what you include on your classpath. It obviously has limitations compared to the full-fledged DI containers, and it also suffers from being dynamic - as in, there is no compile-time verification when you assemble your application, and there is a theoretical runtime overhead from doing discovery and class-loading. As you have probably guessed, I am a fan of Inversion of Control but not necessarily of the many Dependency Injection manifestations that pop up; they all start out wanting to solve a simple problem but usually turn into just another layer of indirection one has to comprehend.

Android and Kotlin specifics

I linked to the Android documentation for ServiceLoader above, so that obviously means the pattern also works on the Android platform. For it to work in an Android ARchive module (AAR), you need to create a /resources folder (New -> Folder -> Java Resources Folder inside Android Studio) and place your service registration file /META-INF/services/INTERFACE_NAME in there.


You can also use plain vanilla Java JARs to the same effect. And since Java and Kotlin interoperate so wonderfully, things work equally well in Kotlin!

Dependency Injection through the classpath

I use this mechanism to be able to swap out a real (remote) web-service repository with a fake/mock (local) one, so that my unit tests can run super fast. Depending on your needs, you may also have your functional/UI tests use this fake/mock - or you could even provide a third implementation - all through the power of Gradle.


If you are a commuter like me, with a flaky Internet connection on the train, this also gives you a way of remaining productive during your transit to and from work. You can do this simply by selecting a Gradle build type.
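One way to express this in Gradle (the module names here are hypothetical) is to let each build type pull in a different implementation module, each carrying its own META-INF/services registration:

```groovy
// app/build.gradle
dependencies {
    // Fast, offline-friendly fake repository for local development
    debugImplementation project(':repository-fake')
    // The real remote-backed repository for release builds
    releaseImplementation project(':repository-remote')
}
```

Because discovery happens via the classpath, no further wiring is needed; whichever module is present wins.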


Of course, Android also allows you to use flavors, if that makes more sense for your project requirements.



As I mention further up, and as should be immediately apparent from reading the documentation for ServiceLoader, you can have multiple implementations of a given interface. Sometimes you really just need one, so you grab the first you can get; other times you go through them all. At some point you are likely to require a chain, order or priority, in which case you may simply add a method to your interface and let the various implementations specify their relative order - akin to how you work with z-order in a UI.
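A minimal sketch of that priority idea (all names made up): let the contract expose a relative order and pick the highest, exactly as you would reason about z-order:

```kotlin
// Contract with a relative order, akin to z-order in a UI.
interface Exporter {
    val priority: Int
    fun format(): String
}

class CsvExporter : Exporter {
    override val priority = 1
    override fun format() = "csv"
}

class JsonExporter : Exporter {
    override val priority = 10
    override fun format() = "json"
}

// In real use the candidates would come from ServiceLoader.load(Exporter::class.java);
// a plain list keeps the sketch self-contained.
fun preferred(candidates: List<Exporter>): Exporter =
    candidates.maxByOrNull { it.priority }!!

fun main() {
    println(preferred(listOf(CsvExporter(), JsonExporter())).format())  // prints "json"
}
```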

It's not rocket science and it has its limitations, but if all you are after is loose coupling and a way to work with multiple implementations of something (in my world this often comes up in testing), then the built-in SPI approach is a really simple and powerful mechanism. I am surprised that more people don't know about it, and instead resort to much more complex solutions when they're really overkill - let me know in the comments what you think.



Monday, November 27, 2017

Muppets among app users

This entry was originally written in Danish, as it contains quotes in that language which cannot readily be translated without loss of meaning.

As an app developer, you must be prepared to take a fair amount of flak (read: scattershot fire from people who don't know better but complain left, right and center) - a relatively well-known phenomenon written about in countless places, e.g. here. Something just happens to people when, in relative anonymity, they get to pronounce and pass judgment on a very thin basis - suddenly they are experts who could do it much better themselves.

An example of a "klaphat" (roughly: a muppet)
Today, for example, I received the following review from Jesper (full name and email known to the editor), which I, as so often before, answered on Google Play.



Jesper is a classic example of a misinformed user with an incompatible phone who hasn't quite taken the time to look into the matter. Fair enough, I thought; NFC formats are a tricky subject for ordinary people to grasp. But then I received a follow-up email:

My phone has an NFC chip. The problem is that the app is not set up for all newer phones, only a very select few. That's just idiocy. Maybe your app would get a slightly better rating if it had been developed properly..

OK, the man has clearly not understood anything of what I write - not on Google Play, not on my blog, not in the reply to his review. On top of that, condescending words like "idiocy" now appear, along with the app not being "developed properly". I get mildly irritated by the tone and write back, this time in his own language:

Hi Jesper,
Fortunately, the app is one of the best-rated Rejsekort apps on Google Play. The rating would be even better, though, if it weren't for muppets like you who think they know more than they do. If you want to know a bit about how things really are, you can read what I write about the matter here:
http://blog.bangbits.com/2016/05/rejsekort-nfc-og-smartphone.html
To summarize: no, NFC is not just NFC, many variations exist, and Rejsekort A/S uses Mifare, which unfortunately not all NFC chips support. The app is set up to always allow installation of Rejsekortscanner on new phones; only when I get feedback from users can I filter devices out on Google Play. Most people are kind and understanding about this challenge, which is beyond my control; others think they know better and give a bad rating, even though this is a free app that helps 20,000 people daily. You can of course decide for yourself which category you want to belong to. :)

Here I feel I make visible the thin ice Jesper is actually standing on, while also giving him a chance to join the category of people who learn something new and admit it - we all do from time to time. But no...

Are you calling me a muppet..?? Well, it's not me who developed something that doesn't work. That's you. So the muppet must be you. Maybe you could manage to develop a converter so the app could be used on even more phones. If it's only 20,000 out of 2 million Rejsekort users, one must say you are faaaaaaaaar from having done your job well enough. But fine by me that you can't take honest and constructive criticism and that you just turn nasty and mean. But it's great that you've given it to me in writing, so I can pass it on to one of my really good friends who is a journalist and who hates people like you who trash their customers, or perhaps future customers. In any case, that's not the best position to put yourself in. There's a saying that the customer is always right. Maybe you should think a bit more about that..
I don't know what makes Jesper perceive himself as a customer (they normally pay for something); at most he is a user of an app I make available for free. Well, I consider ignoring the muppet but still choose to try to explain why I don't find his superficial and unfounded criticism particularly constructive:

Jesper,
The fact is that I state loud and clear on Google Play:
"If the app does not work on your smartphone due to missing hardware support, please refrain from giving negative feedback, as it is really the phone's fault and not the app's! I would, however, love to hear from you, so that I can update the list and spare others the situation. :)"
But you clearly haven't taken the time to read that; instead you complain in a very negative tone, as if you know what you're talking about - which you clearly don't, which is why I wrote to you directly about why the app doesn't work on your phone and why it never will. I also linked to an article on my blog that has described the situation for years, which you could easily have found had you taken the trouble to search Google for once.
What you offer is not constructive criticism, for the reasons I mention above - there is nothing to be done; you need to buy yourself another phone if you want to use the app! Unfortunately you continue your poor style, not engaging at all with what I explain (did you even read it?), and I naturally cannot spend more time on you, as you appear to be beyond pedagogical reach. I don't know what you think you'll gain from your journalist threat, but should it come to that, I look forward to seeing it - you will mainly be exposing yourself as quite a muppet.
I should probably just have followed my intuition and ignored him, because soon after I received the following:

Casper the muppet...
Spazzy-Casper...
Mongol-Casper...
Casper Blockhead...
Casper Cardboard-box...
Crappy-Casper...
Casper Cheering-Idiot...
Koka-Casper...
Yes, I could go on. It's actually quite funny that you think you know anything about developing apps.. I happen to know that it is possible to convert, for example, a Bluetooth version, and thus it would also - if one were skilled and intelligent enough - be possible to develop a converter for an NFC chip..
But you probably can't manage that with your microscopic bird brain.. I'm just thinking: you can't phrase yourself correctly, nor spell, nor argue well enough for your knowledge. And yes, I have seen your blog, which there is no basis for bragging about.. It looks like something my 8-year-old nephew could do much better. At least he could have spelled, phrased and constructed it better than you have managed.
But I thank you for your kindness and your incompetence. There are clearly no limits to what I can allow myself, given the way you behave. After all, I haven't previously called you all sorts of strange things, even though you trashed me the very first time you replied to my inquiry.
I don't think I need to comment on the above; it speaks for itself.

Conclusion
You can't make everyone happy, and some people simply remain beyond pedagogical reach. The time I spent on the muppet Jesper could have been spent with my family or on some real work. So today's lesson must be: beware of muppets whose only purpose is to steal your time!

To get at least a little something out of the time I have spent trying to get through to Jesper, I have immortalized the unedited dialogue in this blog entry - that way the fool also has something in writing for his journalist friend. :)


Thursday, April 20, 2017

Android NFC radio control using instrumentation

I have always worked a lot with NFC on Android. For this reason, I tend to favor real devices over emulators, since a missing NFC radio means there's no way to truly test the intricacies of radio communication. Unfortunately, one cannot power cycle the NFC radio through any official API without jumping through hoops on rooted devices, so ensuring the NFC radio's power state during testing is an uphill battle. For instrumented test scenarios, however, there is actually a way forward.

UIAutomator to the rescue

While not as elegant as using an API, we can launch the settings screen for NFC and manipulate it through the use of instrumentation. This is NOT possible using modern Espresso, which limits you to the app under test, but thankfully the UIAutomator framework is still available. The accompanying UIAutomator Viewer tool (which has now moved to sdk/tools/bin/uiautomatorviewer) is a great asset in this regard, as it helps us identify the widget we need to manipulate.


What the NFC toggle button is named is not consistent across devices and versions of the operating system, so we have to get a bit heuristic here. In practice, looking through my roughly 10 devices running various versions of Android with various custom skins, I have identified 3 unique resource IDs for the toggle button: com.android.settings:id/switch_widget, android:id/switchWidget and android:id/switch_widget. Unfortunately, on Android 7 (on Huawei devices, anyway) it seems launching the ACTION_NFC_SETTINGS intent will not actually get you where you want, but requires an additional navigational step. This complicates the code a bit, but it's still possible to make it work.

To launch the Settings activity prior to any Activity under test, we need to pass along the Intent.FLAG_ACTIVITY_NEW_TASK flag. From there, we can write our logic to help us toggle NFC state.

    private void toggleNfc(final Context context) {

        final Intent intent = new Intent(Settings.ACTION_NFC_SETTINGS);
        intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        context.startActivity(intent);

        final UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());

        findAndToggleNfcInUI(device);
    }

    private void findAndToggleNfcInUI(final UiDevice device) {

        final UiObject toggleButton = device.findObject(new UiSelector()
                .resourceIdMatches("(com.android.settings:id/switch_widget|android:id/switchWidget|android:id/switch_widget)"));

        try{
            toggleButton.click();
            device.pressBack();
            return;
        }catch(UiObjectNotFoundException e){
            UiObject2 nfcMenuItem = device.findObject(By.textContains("NFC"));

            // Bail out early if no NFC menu entry could be located either
            if(nfcMenuItem == null){
                throw new AssertionError("Unable to locate NFC toggle or menu entry", e);
            }

            // Move up in the view hierarchy until we're at a clickable item
            while(!nfcMenuItem.isClickable()){
                nfcMenuItem = nfcMenuItem.getParent();
            }

            // Issue click to navigate into menu
            nfcMenuItem.click();

            // Wait for any UI jitter to settle
            getInstrumentation().waitForIdleSync();

            // Try to toggle NFC button using this new child activity
            findAndToggleNfcInUI(device);
        }
    }


Composable test aspect using a JUnit rule

The code above is fine and dandy, but I'm a big proponent of composable and reusable aspects, so let's take advantage of the fact that we can encapsulate the functionality nicely using JUnit's rule mechanism. If you're new to these, you may read up on them here. The resulting NfcStateRule class can be seen below.

/**
 * JUnit test rule for controlling NFC radio power state. Useful in order to ensure NFC is
 * enabled or disabled prior to executing a test.
 */
public class NfcStateRule implements TestRule {

    private static final String NFC_TOGGLE_WIDGET_RESOURCEIDS =
            "(com.android.settings:id/switch_widget|android:id/switchWidget|android:id/switch_widget)";

    private final boolean desiredState;

    public NfcStateRule(boolean desiredState) {
        this.desiredState = desiredState;
    }

    @Override
    public Statement apply(final Statement base, final Description description) {
        return new Statement() {
            public void evaluate() throws Throwable {

                try{
                    final Context context = InstrumentationRegistry.getTargetContext();
                    ensureNfcState(context, desiredState);

                }catch(final Throwable e){
                    e.printStackTrace();
                }
                base.evaluate();
            }
        };
    }

    private void ensureNfcState(final Context context, final boolean desiredState) {
        if(desiredState){
            ensureNfcIsEnabled(context);
        }else{
            ensureNfcIsDisabled(context);
        }
    }

    private void ensureNfcIsDisabled(final Context context) {
        if(isNfcEnabled(context)){
            toggleNfc(context);
        }
    }

    private void ensureNfcIsEnabled(final Context context) {
        if(!isNfcEnabled(context)){
            toggleNfc(context);
        }
    }

    private boolean isNfcEnabled(final Context context) {
        final NfcAdapter nfcAdapter = NfcAdapter.getDefaultAdapter(context);

        if (nfcAdapter == null) {
            return false;
        }

        return nfcAdapter.isEnabled();
    }

    private void toggleNfc(final Context context) {

        final Intent intent = new Intent(Settings.ACTION_NFC_SETTINGS);
        intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        context.startActivity(intent);

        final UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());

        findAndToggleNfcInUI(device);
    }

    private void findAndToggleNfcInUI(final UiDevice device) {

        final UiObject toggleButton = device.findObject(new UiSelector()
                .resourceIdMatches(NFC_TOGGLE_WIDGET_RESOURCEIDS));

        try{
            toggleButton.click();
            device.pressBack();
            return;
        }catch(UiObjectNotFoundException e){
            UiObject2 nfcMenuItem = device.findObject(By.textContains("NFC"));

            // Bail out early if no NFC menu entry could be located either
            if(nfcMenuItem == null){
                throw new AssertionError("Unable to locate NFC toggle or menu entry", e);
            }

            // Move up in the view hierarchy until we're at a clickable item
            while(!nfcMenuItem.isClickable()){
                nfcMenuItem = nfcMenuItem.getParent();
            }

            // Issue click to navigate into menu
            nfcMenuItem.click();

            // Wait for any UI jitter to settle
            getInstrumentation().waitForIdleSync();

            // Try to toggle NFC button using this new child activity
            findAndToggleNfcInUI(device);
        }
    }
}

To use our NfcStateRule in an actual test, simply include it as a member and specify the desired NFC radio power state by passing a boolean to the constructor.

@LargeTest
@RunWith(AndroidJUnit4.class)
public class NfcDisabledTest {

    @ClassRule
    public static final NfcStateRule nfcStateRule = new NfcStateRule(false); // Make sure NFC is disabled

    ...actual test...
}

Voila, now it's possible to set up test scenarios that rely on the NFC radio. This is important for many of my Espresso tests to work consistently and reliably every time, as demonstrated by the screen below, which tests the UI when the user has disabled NFC.

An example of an Activity/Fragment whose UI state depends on the state of the NFC radio.

Conclusion

Where there is a will, there is a way! The above is not nearly as clean as having a proper API, as we do for WiFi, GPS etc. For acceptance testing, however, I much prefer this kind of automated UI manipulation over mocking or polluting the app itself with short-circuiting logic.

By definition, the approach must be considered fragile, since the NFC toggle button may be named something different on devices I have not yet had my hands on! If you run into this problem, the fix is easy - simply use the UIAutomator Viewer and expand the regular expression to cover the custom view. In a test scenario you usually have full control of the devices anyway, so it's not really a practical concern, since end users will never be exposed to this code.
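As a sketch of that fix: add the newly discovered resource ID as one more alternative in the alternation, and sanity-check the pattern with plain java.util.regex before wiring it into the rule. The com.vendor.settings:id/nfc_toggle name below is purely hypothetical, standing in for whatever UIAutomator Viewer reveals on your device.

```java
import java.util.regex.Pattern;

public class NfcToggleRegexCheck {

    // The original alternation plus one hypothetical vendor-specific ID
    // (com.vendor.settings:id/nfc_toggle is made up for illustration).
    static final Pattern NFC_TOGGLE_PATTERN = Pattern.compile(
            "(com.android.settings:id/switch_widget"
            + "|android:id/switchWidget"
            + "|android:id/switch_widget"
            + "|com.vendor.settings:id/nfc_toggle)");

    static boolean matches(String resourceId) {
        // Full match against any of the known toggle-button IDs
        return NFC_TOGGLE_PATTERN.matcher(resourceId).matches();
    }

    public static void main(String[] args) {
        // The three known IDs and the hypothetical vendor ID should all match
        System.out.println(matches("android:id/switchWidget"));           // true
        System.out.println(matches("com.vendor.settings:id/nfc_toggle")); // true
        // An unrelated widget ID should not
        System.out.println(matches("android:id/title"));                  // false
    }
}
```

Once the extended pattern passes a quick check like this, the same string can replace the NFC_TOGGLE_WIDGET_RESOURCEIDS constant in the rule unchanged.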

As usual, the code may be buggy, may not work on all versions of Android and is definitely not production safe. You may assume a Public Domain license of the code snippets above. Feel free to contribute back in the comments if you want to share your findings or experiences regarding the matter.

Thursday, January 5, 2017

BangBits Privacy Policy


Welcome to the BangBits Privacy Policy

When you use apps and other software developed by BangBits, you trust us with your information. This Privacy Policy is meant to help you understand what data we collect, why we collect it, and what we do with it. It is important to understand that BangBits operates both as an owner of given software and as a proxy for work developed by Customers. Apps published by BangBits but forming part of a specific Customer solution are treated separately in the "Specific Products" section below.


Information we collect and why we collect it

We collect information to provide a better customer experience. This may happen through various forms of remote logging using Google Analytics, Firebase Analytics or similar tools. At no time do we collect personal data directly mappable to an identifiable user. What can be collected is:
  • Device identifiers (DeviceID, IMEI and handset identifiers) in order to black-list and/or white-list fraudulent and/or abusive users
  • Stack traces and associated debugging data when the app is behaving unexpectedly
  • Behavioral data to better understand how users are using the software

Specific Products

The following notices explain specific privacy practices with respect to specific products offered by BangBits that you may use:

"Rejsekort Kontrol"

The software known as "Rejsekort Kontrol" is located on a closed business domain on Google Play and, as such, is only accessible to (invited) users of that Organization. The app, under the control of BangBits, is part of a bigger software system owned by Rejsekort A/S, the Danish national transit ticketing authority. No personally identifiable data is collected by or transmitted to/from the app. Recent travel data from a Travelcard is inspected and collected for backend processing by Rejsekort A/S, but this is governed by Rejsekort A/S' own Privacy Policy at:


https://www.rejsekort.dk/~/media/rejsekort/pdf/privatlivspolitik/privatlivspolitik---13-10-2014.pdf