Archive for the ‘Mobile’ Category

I log a lot of event data in Microsoft AppCenter. Thousands of events per day. Unfortunately, AppCenter has a hard limit of 200 distinct custom events for their event data. So, when you look at that great event filter they have, you’re not actually seeing all the data. HOWEVER, that does NOT mean AppCenter is limited to a scant 200 events. Hardly. Behind the scenes, AppCenter is powered by Application Insights. As long as you have storage available, you can keep all your events in App Insights. The problems I ran into were:

  1. I didn’t know Kusto, or KQL, the query language used by App Insights, and
  2. It’s not obvious how to access all the metadata I log for each event.

What’s metadata, you may ask? Well, it’s the additional data you log alongside an event. In AppCenter, the term for metadata is Properties, but I digress. For example, if you’re logging a Software Update success or failure, you may also include metadata about a product ID, model number, firmware version, and so forth. Finding the event data is easy. Finding and reporting on the metadata is not so straightforward.
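For context, here’s roughly what logging one of these events looks like on the app side with AppCenter’s Analytics API – the property names and values below are just illustrative:

using System.Collections.Generic;
using Microsoft.AppCenter.Analytics;

// Log the event along with its metadata ("Properties" in AppCenter terms).
// These properties are what end up in customDimensions.Properties in App Insights.
Analytics.TrackEvent("Software Update: Succeeded", new Dictionary<string, string>
{
    { "Model Number", "model1" },
    { "Firmware Version", "1.2.3" },
    { "Product ID", "ABC123" }
});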

So, dear reader, I have an example query for you. You can copy & paste this into your App Insights query editor and be good to go.

So here’s the query I used to extract a summary of how many software updates succeeded, for specific model numbers, with a count by model number:

customEvents
| where name in ("Software Update: Succeeded")
| extend jsonData = tostring(parse_json(tostring(customDimensions.Properties)).['Model Number'])
| where jsonData in ("model1", "model2", "model3", "model4")
| summarize count() by jsonData

So what’s happening here? Let me explain:

customEvents

This is the table where App Insights stores all events sent by AppCenter.
| where name in ("Software Update Succeeded")

This filters those events by the event name.
| extend jsonData = tostring(parse_json(tostring(customDimensions.Properties)).['Model Number'])

This parses the metadata field – aka customDimensions.Properties – as JSON, extracts a particular metadata property – in this case, Model Number – and then returns that value as a string.
| where jsonData in ("model1", "model2", "model3", "model4")

This is a simple filter query. I found if I wanted to get all model numbers I could simply use the following, though your mileage may vary:

| where jsonData !in ("")

And then finally:
| summarize count() by jsonData

This takes the results, by model number, and summarizes with counts. App Insights will even automatically generate a graph for you!

As you refine the query above and use more extend keywords to extract more data, you may want to use a more meaningful variable name than jsonData 😁 For example, here’s a more robust query I wrote that counts the unique users performing the action, broken down by product model:

customEvents
| where name in ("Software Update: Succeeded")
| where timestamp > ago(30d)
| extend modelNumber = tostring(parse_json(tostring(customDimensions.Properties)).['Model Number'])
| extend modelName = case(
modelNumber == "model1", "Headphones",
modelNumber == "model2", "Soundbars",
modelNumber == "model3", "Powered Speakers",
"n/a")
| where modelNumber !in ("")
| summarize count() by user_Id, modelName
| summarize count() by modelName

The two summarize steps first group by user and model, then count the distinct users per product model. Here you can see how I used proper variable names and other useful customEvents data, such as user_Id and timestamp. This also helps me get data similar to the cool, useful graphs AppCenter shows on their dashboard.

You can do a lot more with this data, but it should get you started. I hope it helps you as it did me.

Additional tip:

Application Insights only stores data for 30 days by default. If you want to retain and report on your events beyond that timeframe, make sure you update your App Insights instance settings.

This past week, I had the opportunity to write an app to handle scanning in patients for CoViD testing in our city, Fishers, Indiana. Fishers is one of the top 10 places to raise a family in the United States. Reinforcing that reputation, the city is providing CoViD testing to all 90,000 residents, free of charge. The process is simple:

  1. Register online via an assessment
  2. Receive a patient ID
  3. Scan in at appointment, assign test, and take test
  4. Receive answers within 3 business days

Seems simple, right? Our city isn’t using the painful “touch the back of your eyeball” technique, either. Instead, it’s at the front of the nasal cavity – simple, painless, and you get results. It also ensures contact tracing where there are infections, so as a community we can help prevent the spread of disease.

But can we do it?

The problem is, roughly a week ago, Step 3 didn’t exist. The patient ID was from a survey with a QR code. The kit didn’t have any identifier whatsoever – it’s just one of many bagged kits in a box. And this data was being manually entered by other entities running similar tests. To give you an idea of how prone to error this process is, consider the patient ID looks like this:

Sample Survey ID

Any chance for typos much? And that doesn’t solve the kit identifier problem – because there isn’t one.

Box of Tests
Figure: A box of test kits.

Solving the Problem

So, last Saturday, I received an email from our City’s IT Director. He wanted to know if I knew anyone who could marry the patients’ ID with the kit ID for proper tracking. If you know me, this type of project is right up my alley. This project screamed MOBILE APP! I said “I’ll do it!” It would be like a weekend hackathon!

Most cities aren’t offering CoViD testing, and not a single state is offering such a service to all residents. Fishers is different – we’re a “vibrant, entrepreneurial city,” as our awesome forward-thinking mayor, Scott Fadness, often exclaims. His team takes novel approaches to addressing community needs, and this was no different.

Where there is testing, as I understand it, it’s with PCs and handheld scanners. What a nightmare it must be to keep such a setup running – with a laptop, with software that’s manually installed, and patients scanned with a handheld that can’t scan through glass. I’ve worked with those setups before – the technology issues are a huge PITA. Let alone deploying updates in any timely fashion!

We decided at the get-go a mobile app would be the right approach. When asked about barcode scanners, I explained “We don’t need one.” The built-in cameras on modern day cell phones are more than capable of scanning any type of barcode, QR code, and so forth. As an added bonus, any cell phone we use will have Internet connectivity, with Wi-Fi as a backup. The beauty of it is one single device, one single app, everything self-contained and easy to use.

The beauty of it is one single device, one single app, everything self-contained and easy to use.

The Requirements

After our initial discussion, and a bit of back and forth, this was the decided-upon workflow:

  1. Scan the Patient QR Code and Kit ID.
  2. Come up with a Kit Scanning Process. We decided on CODE39 barcodes that would be printed beforehand so technology wouldn’t be an issue each day.
  3. Store the Patient ID and Kit ID for later retrieval. This ended up being “uploaded” to the survey itself, ensuring we didn’t need to store any PII, and didn’t have to build a back-end data store. Small favors…

And this was the mockup:

2020-04-25 Fishers CoViD App
Figure: Whiteboarded app.

Draft Napkin
Figure: Rough brain dump with ideas.

Initially, we talked about generating the Kit barcode on the mobile device, then printing it to a wireless printer in the testing bay. This certainly seemed possible. However, the more I thought about it, the more I realized we could simply pre-print the labels and affix them as needed. This would provide some obvious benefits:

  • We wouldn’t have to come up with a mobile printing solution, which can be tricky, and is not a simple problem to solve cross-platform.
  • We’d keep a printer breakdown out of the picture, ensuring technology didn’t “get in our way”

The key is to get the patients in, tested, and out as efficiently as possible. The simpler we kept the process, the less could go wrong. So, on-demand printing was eliminated, and we’d simply pre-print labels instead. They’d be affixed to the test kit and then assigned to a patient.

Common Needs of Mobile Apps

When determining the app development approach, I took into consideration that every mobile app I’ve built has generally had three primary needs that must be addressed:

    1. Where does data come from? Usually this is an API or the user. If an API doesn’t exist, considerable time is necessary to build one.
    2. Where does data go? Also usually an API for some data store. Same API issue.
    3. How does the user interact with data? The app is useless if the user can’t figure it out and, if possible, enjoy the experience. This can have design cost and time impacts.

For Need 1, we knew we had a QR Code. BUT how would we know it’s valid? How would we get the patient data? Well, it just so happened the survey provider had an API. Sweet! We hopped on a call with them and they provided an API key and documentation. That’s how API access should work! They even provided a RegEx to validate the scanned patient ID, which internally to them was actually just a survey ID.

What about the kits? We decided to use a CODE39 barcode font and print on standard Avery labels. We came up with a standard naming and numbering convention, a RegEx to validate, and would pre-print them – a few hundred per day. This would ensure the labels were verifiable after scanning, and that printing wouldn’t be an issue each day. We’d take care of technology problems beforehand – such as printer issues – so they wouldn’t impact patient processing.
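Just to illustrate the idea, here’s a minimal C# sketch of those format checks. The real patient ID pattern came from the survey provider and the kit naming convention was our own, so both patterns below are purely hypothetical:

using System.Text.RegularExpressions;

// Hypothetical patterns for illustration only – the real ones differ.
static class ScanValidation
{
    static readonly Regex PatientIdPattern = new Regex(@"^[A-Za-z0-9_]{16,20}$"); // scanned survey/patient ID
    static readonly Regex KitIdPattern = new Regex(@"^FCT-\d{5}$");               // pre-printed CODE39 kit label

    public static bool IsValidPatientId(string scanned) => PatientIdPattern.IsMatch(scanned);
    public static bool IsValidKitId(string scanned) => KitIdPattern.IsMatch(scanned);
}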

Barcodes on Avery Labels
Figure: A combination of Excel to generate the labels following the naming convention, plus mail merge to place them onto the off-the-shelf labels.

OK, now for Need 2… We can get the data, but we have to store it somewhere. Initially, we thought about building a separate back-end. The survey provider, Qualtrics, explained we could send the data back to their system and store it with the initial survey. Well, that was much better! No new storage API development was needed, as they already had the infrastructure in place. Building a new, solid, secure API in a short period of time would have been no small task.

For Need 3, the user experience, I borrowed my grandfather’s phrase: It must require a PH, D. Push Here, Dummy. I wanted “three taps and a send,” as follows:

  1. Scan the Patient – Once scanned, look the user up and verify they exist, showing confirmation details on the screen.
  2. Scan the Kit – Ensure the barcode matches the expected format.
  3. Confirm & Submit – Prompt to ensure patient details, such as name and postal code, have been verified, then confirm the entry has been saved.

It must require a PH, D. Push Here, Dummy.

That’s it – no chance for typos, and verification at every step, helping things go right. Little animations would show when a step had been completed, and scans could be done in any order.

Picked up 5600 Labels
Figure: Texting back and forth, getting our bases covered.

Xamarin As The Dev Stack

We’re building a mobile app here, and we may need it for multiple platforms. iOS – both iPhone and iPad, Android, and perhaps even Windows. Building each platform separately would take a lot of time to complete – time we didn’t have. It was April 25, we needed to be testing by April 27, and we were going live May 1.

The right choice was Xamarin with Xamarin.Forms – Microsoft’s cross-platform mobile framework. It’s similar to React Native, but you have full access to the underlying platform APIs. That’s because you’re building a real native app, not an interpreted overlay. With a single solution, we could build the iOS, Android, and UWP (Windows) apps, with 90% or more code sharing. I’m a Xamarin certified mobile developer, so this was going to be fun!

Solution Explorer
Figure: The Xamarin app in Visual Studio.

First Draft

Within a few hours, I had an alpha version of the app running. It was rough, and didn’t have the best UI, but it was scanning and talking with the Qualtrics API. Hey, once the base stuff’s working, you can make it look pretty!

The app consisted of a few core components:

  • App Service – Managing any processes the app needed completed, such as retrieving patient survey details, updating a patient survey record, verifying scanned code formatting, and so forth.
  • API Service – Talking back and forth with the Qualtrics API.
  • Analytics Service – Tracking aspects of the application, such as kit scan successes and failures, any exceptions that may occur, and so forth, so we can improve the app over time.
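An analytics service like this is mostly a thin wrapper over whatever SDK you choose. Here’s a rough sketch assuming AppCenter’s Analytics and Crashes packages – the event names are illustrative:

using System;
using System.Collections.Generic;
using Microsoft.AppCenter.Analytics;
using Microsoft.AppCenter.Crashes;

public class AnalyticsService
{
    // Track kit scan outcomes so we can spot problem patterns over time.
    public void TrackKitScan(bool succeeded, string detail = null) =>
        Analytics.TrackEvent(succeeded ? "Kit Scan: Succeeded" : "Kit Scan: Failed",
            new Dictionary<string, string> { { "Detail", detail ?? "n/a" } });

    // Report handled exceptions without crashing the app.
    public void TrackError(Exception ex) => Crashes.TrackError(ex);
}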

Build 1
Figure: Build 1 of the app. I first tested with my Android device, then rolled out to iOS, testing on both an iPhone and iPad.

I also had to ensure scanning went off without a hitch. After all, that’s what this app is doing – getting all the data quickly, then tying it together. I configured the scanning solution to only scan QR codes when scanning the patient ID, and only CODE39 barcodes when scanning kits. That way, if the codes were next to each other, the tech wouldn’t scan the wrong item and cause confusion. Remember, the technicians are medical techs, not computer techs – any technology problem could stop the patient processing flow. We needed to ensure the technology didn’t get in the way. You do that by testing thoroughly, and keeping the end user in mind.
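I won’t go deep into the scanning library here, but with a ZXing-based scanner such as ZXing.Net.Mobile, restricting the accepted formats per step looks roughly like this sketch:

using System.Collections.Generic;
using System.Threading.Tasks;
using ZXing;
using ZXing.Mobile;

public class ScanService
{
    readonly MobileBarcodeScanner scanner = new MobileBarcodeScanner();

    // Patient step: accept only QR codes.
    public Task<Result> ScanPatientAsync() =>
        scanner.Scan(new MobileBarcodeScanningOptions
        {
            PossibleFormats = new List<BarcodeFormat> { BarcodeFormat.QR_CODE }
        });

    // Kit step: accept only CODE39 barcodes.
    public Task<Result> ScanKitAsync() =>
        scanner.Scan(new MobileBarcodeScanningOptions
        {
            PossibleFormats = new List<BarcodeFormat> { BarcodeFormat.CODE_39 }
        });
}

The returned Result (or null if the scan was cancelled) carries the decoded text, which then goes through the format validation and lookup steps described above.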

Testing Scanning
Figure: QR code and CODE39 barcodes for testing.

Final Approach and User Experience

Once the UI was working, I added final touches to the UX to make it friendly and easy to use:

  1. When a technician successfully scanned a patient, the information would appear and a green checkmark would animate in. This would clearly indicate that step was completed. If there was an issue with the verification, they would be prompted to scan again. Optionally, they could manually enter the patient ID, which would follow the same validation steps.
  2. When a kit was scanned, another green checkmark would animate in, signifying that step, too, was complete.
  3. Once both steps had been completed, the technician would clearly understand the two greens meant “good to go” and could submit the patient data. They would be prompted to confirm they had verified all patient data and everything on the screen was correct.
  4. Once patient data was successfully transmitted, a confirmation dialog would appear. Upon dismissal, the UI would animate to the reset state, making it clear it’s OK to proceed to the next patient.
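To give a feel for how those touches translate to Xamarin.Forms code, here’s a rough sketch of the checkmark animation and the confirm prompt – PatientCheckIcon and the method names are made up for the example:

using System.Threading.Tasks;
using Xamarin.Forms;

public partial class ScanPage : ContentPage
{
    async Task ShowPatientVerifiedAsync()
    {
        // Animate the green checkmark in once the patient scan is verified.
        PatientCheckIcon.Opacity = 0;
        PatientCheckIcon.IsVisible = true;
        await PatientCheckIcon.FadeTo(1, 250);
        await PatientCheckIcon.ScaleTo(1.15, 100); // small "pop" to draw the eye
        await PatientCheckIcon.ScaleTo(1.0, 100);
    }

    async Task OnSubmitTappedAsync()
    {
        // Prompt the tech to confirm the on-screen details before sending anything.
        bool verified = await DisplayAlert("Confirm",
            "Have you verified the patient's name and postal code?", "Yes, submit", "Not yet");
        if (!verified)
            return;

        // ...submit the patient + kit pairing, then animate the UI back to its reset state.
    }
}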

Devices, TestFlight, and Apple, Oh My!

So the app was in a good state. How were we going to get it on devices? This isn’t an app that we want in the App Store. It’s not a general consumer app – at least, not yet. TestFlight to the rescue! We’d push it to Apple’s TestFlight app testing service, then enroll all the iOS devices. That would ensure that, as we tweaked the app, we could quickly push the updates without any messy manual installs.

For those that have deployed iOS apps before, you know this isn’t a fast process. The first version of any app into TestFlight must be reviewed by Apple. I uploaded the first version and waited…

Roughly a day later, Apple rejected the app. BUT WHY? Well, we hadn’t provided any sample QR codes or bar codes to scan, so they rejected it. UGH! Really?? I didn’t even know that was a testing requirement! You learn something new every day… So I sent a URL with some examples to test with, as you can’t upload files to the testing site, and waited. Hours later, thankfully, Apple approved the app for testing!

App Store Test Rejection
Figure: Apple beta review rejection email.

We enrolled the various iPhones and iPads in TestFlight and we were able to start testing. Other than a restriction with SSL over the City’s network, which was quickly resolved, we had our devices ready to go. Not bad for under 48 hours!! 

Note that, once an app is in TestFlight, additional builds go through almost instantly. This ensured we could tweak as needed and not wait 24+ hours to validate each time.

TestFlight Versions
Figure: We could release updates with velocity after the initial approval.

Rolling It Out – Dress Rehearsal

We wanted to make sure the app worked without a hitch. A day before release, we had a “dress rehearsal.” Everyone would be ready for the testing, and we’d introduce the app. It’s a small part, but it ties it all together. Tracy, the I.T. Director, and I had been testing in earnest prior to this time, and we were feeling pretty good about it.

That morning, I walked the users through the app, joking about the PH, D requirement. Prior to my arrival, they had been testing the process on one of our citizens, who must have been quite tired from all the work:

Test Patient

The pressing questions were:

  • Can we scan a QR code through glass, so a citizen doesn’t have to roll down their window? Yes, unless it’s super tinted, which would be illegal anyway.
  • What if we can’t scan the code? This wasn’t an issue, except for a QR code variant issue discussed later, and manual entry was supported just in case.
  • What if Internet access goes down? We had cellular backup on all devices.
  • How will we apply the barcode to the kit? Peel and stick, then scan. We scan after removal from the main sheet so we don’t scan the wrong code. In a later version we added a prompt when the scanned patient had already been through the process.
  • What if the QR code is used more than once? This wasn’t an issue, as the name and appointment time wouldn’t match.

Here are a few photos from that morning – that was a lot of fun!

Figure: Photos from that morning – Box of Tests, Check In, P100 Masks, Presenting the App, Road to Test, Test Complete, Test Vial.

    Day 1!

    Day 1 was here, and real citizens were about to get tested. I slept well the night before, knowing we had tested thoroughly. We only had one hiccup: The QR code in the email was different than the QR code on the confirmation website. This was causing validation errors, as the website QR code’s patient ID couldn’t be found in the system. Not an app issue, but that doesn’t matter.

    Couldn't find Patient ID
    Figure: Ruh-roh! The QR codes weren’t matching between different sources. Yellow alert!

    The survey provider quickly addressed the issue and we were good to go. It wasn’t a big deal – they provided a website to manually enter the patient ID for scanning with a properly generated QR code, and it barely impacted patients. Day 1 was a rousing success!

    2020-04-30 In the Field
    Figure: The app in use!

    Wrapping Up

    Going from no-app to app being used with patients in less than one week was an incredible experience. It feels great to help the community during this period of uncertainty. I’m grateful our city wanted to make the process as seamless as possible, using technology to help things go right, and providing me the opportunity to assist. I’m thankful that, once again, Xamarin was a great solution.

    I’ll probably have a technology walk-through in the near future – I didn’t want to concentrate on the underpinnings of the application for this article. I’ll leave that to a discussion for the Indy Xamarin Meetup.

    Final Version
    Figure: The final app, with my info scanned in.

    As part of my .NET 301 Advanced class at the fantastic Eleven Fifty Academy, I teach Xamarin development. It’s sometimes tough, as every student has a different machine. Some have PCs, others have Macs running Parallels or Bootcamp. Some – many – have Intel processors, while others have AMD. I try to recommend students come to the class with Intel processors, due to the accelerated Android emulator benefit Intel’s HAXM – Hardware Accelerated Execution Manager – provides. This blog entry is a running list of how I’ve solved getting the emulator running on so many machines. I hope the list helps you, too.

    This list will be updated from time to time, as I find new bypasses. At this time, the list is targeted primarily at machines with an Intel processor. Those with AMD and Windows are likely stuck with the ARM emulators. Umm, sorry. I welcome solutions there, too, please!

    Last updated: December 4, 2017

    Make sure you’re building from a path whose total length is less than 248 characters.

    That odd Windows problem of long file paths bites us again here. Many new developers tend to build under c:\users\username\documents\Visual Studio 2017\projectname. Add to that the name of the project, and all its subfolders, and the eventual DLLs and executable are out of reach of various processes.

    I suggest in this case you have a folder such as c:\dev\ and build your projects under there. That’s solved many launch and compile issues.

    Use the x86 emulators.

    If you have an Intel processor, then use the x86 and x64 based emulators instead of ARM. They’re considerably faster, as long as you have a) an Intel processor with virtualization abilities, which I believe all or most modern Intel processors do, and b) Intel’s HAXM installed.

    Make sure VT-x / Hardware Virtualization is enabled.

    Intel’s HAXM – which you can download here – won’t run if the processor’s virtualization is disabled. You need to tackle this in the BIOS. That varies per machine. Many devices seem to ship with the feature disabled. Enabling it will allow HAXM to work.

    Uninstall the Mobile Development with .NET Workload using the Visual Studio Installer, and reinstall.

    Yes, I’m suggesting Uninstall + Reinstall. This has worked well in the class. Go to Start, then Visual Studio Installer, and uncheck the box. Restart afterwards. Then reinstall, and restart.

    Mobile Development Workload Screenshot

    Use the Xamarin Android SDK Manager.

    The Xamarin team has built a much better Android SDK Manager than Google’s. It’s easy to install HAXM, update Build Tools and Platforms, and so forth. Use it instead and dealing with tool version conflicts may be a thing of the past.

    Make sure you’re using the latest version of Visual Studio.

    Bugs are fixed all the time, especially with Xamarin. Make sure you’re running the latest bits and your problems may be solved.

    Experiment with Hyper-V Enabled and Disabled.

    I’ve generally had issues with virtualization when Hyper-V is enabled. If you’re having trouble with it enabled, try with it disabled.

    To enable/disable Hyper-V, go to Start, then type Windows Features. Choose Turn Windows Features On or Off. When the selection list comes up, toggle the Hyper-V feature accordingly.

    Note: You may need to disable Windows Device Guard before you can disable Hyper-V. Thanks to Matt Soucoup for this tip.

    Use a real device.

    As a mobile developer, you should never trust the emulators to reflect the real thing. If you can’t get the emulators to work, and even if you can, you have the option of picking up an Android phone or tablet for cheap. Get one and test with it. If you’re not clear on how to set up Developer Mode on Android devices, it’s pretty simple. Check out Google’s article on the subject.

    Try Xamarin’s HAXM and emulator troubleshooting guide.

    The Xamarin folks have a guide, too.

    If all else fails, use the ARM emulators.

    This is your last resort. If you don’t have an Intel processor, or a real device available, use the ARM emulators. They’re insanely slow. I’ve heard there’s an x86 emulator from AMD, yet it’s supposedly only available for Linux. Not sure why that decision was made, but moving on… 🙂

    Have another solution?

    Have a suggestion, solution, or feature I’ve left out? Let me know and I’ll update!


    I’m continuing my resolution to record as many of my programming and technical presentations as possible. I recently spoke at the inaugural Indy.Code() conference. It was excellent, with an incredible speaker line-up. I hope they, too, post some of their presentations online!

    Watch the Video on YouTube

    From the synopsis:

    Should you write your app “native” or use a “cross-platform” solution like React Native, Xamarin, or NativeScript? The new wave of native-cross-compiling solutions provide significant cost savings, code reuse opportunities, and lower technical debt. Does wholly native, per platform development, still play a role in future mobile development? Let’s discuss together.

    In this presentation, we’ll discuss:

    • The growth of native, hybrid, and cross-platform mobile development solutions
    • Cost analysis of multiple native and cross-platform apps
    • Considerations for each native and cross-platform solution
    • Lessons learned

    Slides are available here: https://t.co/5iLhEoEfen

    If you have any questions, I’m happy to answer them! Please email me or ask on Twitter.


    Alright, I found a Moto 360 and I’m enjoying it. The following is not my review. It is a list of bugs Motorola and Google need to fix on this device and across Android Wear. Note this is only what I’ve noticed after one day. I’ll post more as I explore.

    • When you take the watch out of the box, it doesn’t turn on or has a low battery. That’s understandable. What’s not alright is no prompt about the battery level or what to do. It’s simply “Connect your device to Android Wear,” or something to that effect. That’s very un-user-friendly. Where were the UX guys with the setup process?
    • Only one watch face shows the date. $250 and no date? Seriously?
      • Update, thanks to Rich DeMuro: Drag down slightly to see the date.
    • When asking the watch to make a call to a contact with more than one number, it asks "Which One?" However, it doesn’t give you a list. Saying "the first one" works, but I don’t know what I selected until it dials.
    • There’s no confirmation request when sending a text… it just sends it.
    • It sometimes stops listening or lists your options when listening.
    • It sometimes starts listening when it shouldn’t.
    • Carrier messaging apps break the ability to reply to texts. I had to disable Verizon Messaging entirely.
    • Facebook support for displaying the new comments would be nice, like the email display feature.
    • There’s no battery level meter anywhere on the device, or at least that’s obvious.
      • Update, thanks to Rich DeMuro: Drag down slightly from the top to see battery level.
    • The Android Wear app doesn’t show battery level, but Moto Connect does. Weird?
    • Sometimes Google search results take precedence over actions. For example, saying "play ebay by weird al" brings up YouTube results. However, "play technologic by daft punk" plays the song. It’s hit or miss.
    • So far, adding a calendar entry hasn’t worked.
    • There needs to be a notification center to control which notifications are sent to the phone. Yes, you can do it via the App Manager, but it’s horrible.
    • The accelerometer doesn’t always sense the wrist has been moved to a viewing angle.
    • When driving, the accelerometer appears to trigger the display to turn on *a lot*. It’s not good when driving kills your battery.
    • A speaker would be helpful for prompts.

    Added 9/15 afternoon:

    • The Motorola Feedback website doesn’t list the Moto360 as a product. So, how do I register it or get support?
    • The device occasionally says it’s Offline when the phone is only three feet away. I’m thinking this is a bug in the Google Now integration and not an actual communications issue.
    • Asking the device "What is the battery level" always causes the phone to report it’s offline.

    Added 9/17:

    • Saying “Call <insert name here> on cell” doesn’t work most of the time, but saying the same “on mobile” is generally reliable.
    • Calling “Send text to <insert name here>” sometimes asks “Which one?” but only shows the phone numbers. I wasn’t sure if I was sending to the right person because the name wasn’t listed.
    • Most of the time, when the screen turns on when moving even the slightest, the watch starts listening, even if I don’t say “Ok, Google”. It’s very annoying.
    • It would be nice if “Ok, Google” could be changed to something else. I feel like I’m advertising Google every time I use my watch.
    • The pedometer seems inaccurate, registering phantom steps as far as I can tell. The inaccuracy extends to the heart rate monitor. After a long workout, the monitor said I was at 74 bpm, then 90. I took my own pulse, and it was quite off the mark.

    Added 9/30:

    • The latest build, 4.4W.1 KGW42R, has greatly improved battery life. On an average day of use, unplugging the watch at around 7am, I was still at 20% at roughly 9:45pm. Great job, Motorola!
    • Even with Messaging as the default app, I have no option to Reply to texts when the notification appears. This may be due to HTC overriding some default app, but I’m unsure.

    A few tips:

    To launch apps, go to the Google screen, then go to Start… and you can select an app.

    You can say the following things and it’s really cool:

    • Call <person’s name> on mobile
    • Play the song <song name>
    • Play the song <song name> by <artist name>
    • What is the current stock price of <company name>

    I’ve been putting off finishing my HTC One M8 review for a couple months. I’m hoping to finish it soon, but for now, here’s my draft…

    A Dilemma

    Before I start my review, I need to explain the technology dilemma of new phones, and new laptops and desktops, too, for that matter. Technology has come to a performance and feature point where it’s hard for manufacturers to prove any necessity for their new products in these categories. Case in point – my previous phone, the Galaxy Nexus, was perfectly fast for everything I did with it. Sure, it wouldn’t launch apps or take photos as quickly as the newer devices, but it was acceptably fast, so much so that, as I shopped for a new product, the newer devices weren’t obviously beneficial.

    I imagine my dilemma similarly affects the PC market. For the average consumer, is the laptop of today that much better than the laptop of two years ago? If you spend most of your time plugged in, as many users I’ve met do, will they notice the processor speed? The display? They’ll definitely recognize the SSD speed and touch. Yet their old systems are acceptably fast. Lucky for them, new laptops are affordable. Desktop PCs? That’s a different story – there’s nothing really new about them that you’d need to upgrade, and you don’t see many shipping with SSDs.

    Phones, unlike laptops and desktops, are lucky in that they a) are popular enough to drive consumers to buy even when upgrades are unnecessary, and b) have sex appeal. You rarely hear anyone brag about their chic new laptop these days. Well, you used to… That desire has shifted to the phone, now a mini laptop in itself. Yet, beyond the better battery life, what makes a phone better today, other than that you can get a new model up front, and pay it off [again] over two years?

    Anyway, I ignored all that introspection and needs analysis. I bought the HTC One M8.

    The Phone

    First, let’s talk about the One. It’s beautiful. It’s slick. A bit too slick, as the aluminum is so smooth I often was afraid it would fall out of my hands. Thankfully, HTC provides one free screen replacement in the first six months. I like little support touches like that. The HTC Dot View case solved my grippiness issue, which I’ll discuss below. Wow, though – it’s a beautiful phone. I had a number of people ask me “Hey, what phone is that?” and oftentimes heard “I think I’ll be switching to an Android phone next. Wow, that screen is big.” Maybe Google should be courting HTC for its next Galaxy phone?

    The Camera

    The HTC One takes great photos. So why isn’t it my favorite camera? First, we need to explain the difference between HTC’s approach to phone cameras and practically everybody else’s: bigger pixels versus more pixels. The One sports 2 micron pixels vs. the 1.3 micron pixels used by practically every other flagship phone from Samsung, Google, and even Nokia. However, it only has a 4 megapixel effective resolution, versus 13+ on the others. True, the larger pixels bring in more light, and make the HTC One an excellent low-light camera. But when it comes to image quality, that lack of additional resolution makes every shot a make-it-or-break-it affair. With a 16 megapixel imager, for example, you could get a large shot and crop to something perfect. But with 4 megapixels, you’ve got to get it right the first time, lest you risk cropping to Facebook resolution. Definitely nothing good to print, and sometimes so few pixels there’s nothing good to display, either.

    To be fair, the One takes excellent photos. Albeit quite a bit overexposed when there’s too much light… You can’t get balanced exposure between, say, the sky and the grass on a partly cloudy day. If you focus on the grass, the sky turns white. If you focus on the sky, the grass turns almost black. It sounds like something that can be solved with software… I’m hoping HTC has something in the works.

    A few bugs I noticed, in case HTC is listening:

    • You can’t add stickers to a photo taken with a flash or low light. I have no idea why.
    • U Focus is not available for flash or low light photos, either.
    • Facebook uploads from the HTC One M8 appear to be very low resolution. I’ve seen this issue on many HTC Android phones. It looks like HTC has their own Facebook for HTC, but I can’t exactly confirm which uploader is being used when sharing.

    The Dot View Case – The Sleeper Accessory Success story to what Austin Powers was to Sleeper Movie Successes

    Long title, but true. The Dot View case may seem like a gimmick, but it does a great job at what it’s supposed to do. Lined with little holes that form letters and shapes when combined with the One’s screen gestures, you can check the time, make a phone call, answer and decline phone calls, and see if you have any messages all without ever looking at your screen. Samsung and other manufacturers have done similar things by putting cutouts in cases, too. Yet HTC’s approach is unique, and very, very cool. I think many folks who have seen my little demos of the Dot View case are thinking the One is their next phone. Maybe it’s just sheer luck for HTC, but I don’t think I’ve met anyone whose contract isn’t about to expire this year. Good thing I’m not in charge of a survey! <grin>

    Ok, I learned this sort of thing the hard way today… I picked up the brand spanking new HTC One M8 yesterday. So far it’s a fantastic phone. I wanted to add a 32 GB MicroSD card, since it wonderfully supports such expansion. Beware! There’s a little tray that comes out when you use the paper clip in the little hole. Put the MicroSD card in that tray! I thought it was simply a placeholder at first, so I slyly proceeded to simply insert the card into the hole. Whoops!

    If you fall into the same trap, it’s easy to get the MicroSD card out. First, you might as well finish the formatting steps – it’s in there anyway. When that’s done, use the paperclip to release the MicroSD card from the tray. Yes, I know it won’t come out all the way. After releasing it via the eject hole, use the side of the paperclip to gently pull the card out from the right side a little bit. Once you can see the plastic of the card, pull it out the rest of the way with your fingers. Problem solved!

    Good luck!

    -Auri

    I picked up a Dell Venue 8 Pro for $99 as part of Microsoft’s 12 Days of Presents spree. Here are some tips & tricks for the more techy folks out there:

    How to Access the BIOS

    Press the power button once. Then hold down the Volume Down button until the Dell logo disappears. You don’t need a keyboard – it has an on-screen mouse mapped to the touch screen. Cool, eh?

    To access the Advanced settings of the BIOS, follow the instructions through Step 7 below:

    How to Speed Up SSD Disk Access by Modifying the EFI / BIOS

    Thanks to Sasha for the following steps, which can increase speeds by over 50%!

    1) From Windows, bring up the charms (swipe in from right)
    2) Select Settings -> Change PC Settings, or Start, then All Apps, then PC Settings.
    3) Choose Update and Recovery -> Recovery
    4) Under Advanced Startup, select Restart Now
    5) From this blue menu, select Troubleshoot, then select Advanced Options
    6) Select UEFI Firmware Settings, then click Restart
    7) Now, when the BIOS shows up, hit the on-screen ESC button ONLY ONCE.
    8) You’re now in the Main “tab”, with a vertical list of options. From here, select Advanced – this lets you see all the BIOS settings, and is different from hitting the Advanced tab across the top.
    9) Select LPSS & SCC Configuration
    10) Select SCC eMMC 4.5 HS200 Support and select Enabled (Mine was disabled by default)
    11) Select DDR50 Support for SDCard and select Enabled (Mine was disabled by default)
    12) Press F10 on the on-screen keyboard to save, then Save Settings and Exit and you’re all set.

    Getting Back ~5 Gigabytes of Space by Removing Recovery Partition

    The Dell Recovery Partition is essential for restoring your machine should something catastrophic happen. To add insult to injury, Dell often runs out of stock of recovery media, and won’t send you such after a year or two has passed. That’s hit me before, and it’s not fun. So, make sure you’ve backed it up!

    Once you’ve backed up that recovery partition, there’s no point in keeping it. Get those gigs back!

    Here’s how:

    NOTE: Make sure you have at least 50% of your battery left for this process. I wouldn’t do this when hitting the lower ends of the battery spectrum.

    1. Go to All Applications and scroll all the way right to the Dell group. Tap the My Dell application.
    2. Click Backup, even if it says no backup software is installed.
    3. Click the Download Local Backup button. This will provide a link to download Dell Backup and Recovery, which you should download and install. Basically, once you click the Download button, select Run and wait for Setup to do its job. This process can take a long time. Even the download appears to be huge. It’s probably downloading the latest recovery data, but that’s just a guess.
    4. After the software has installed, it will request a restart. So, restart the tablet.
    5. Go to All Applications and back to the Dell group. Note the new Dell Backup and… option. Tap it.
    6. Wait a few moments for the cool clock animation to complete, then agree to whatever terms are presented, or not.
    7. Tap the Reinstall Disks option. This is the equivalent of a Factory Restore partition backup.
    8. Tap USB Flash Drive, which is probably the only real option you have with this unit. This includes use of the Micro SD card, which is what I used, since I didn’t have a USB adapter handy. If you decide to use an external burner, that’s cool, too. But… why?
    9. Select your USB drive, or the MicroSD card. I backed up to an 8 GB MicroSD. Dell estimates the backup at 4.03 GB, so 8 GB should suit you just fine.
    10. Tap Start, then tap Yes when asked if you’re sure about wiping out the USB or MicroSD drive. Of course you’re sure! (right?)
    11. Wait until it’s done.
    12. When it’s complete, click OK, and put the backup media in a safe place. I put it in my Venue Pro’s box.
    13. Go back to Start, then All Programs, then Desktop.
    14. Hold down on the Start button and select Command Prompt (Admin).
    15. Type diskpart to launch the Disk Partition manager.
    16. Type list disk to see the disks, then select disk 0 to focus on the tablet’s internal drive (confirm by its size). Now type list partition to see the available partitions.
    17. Type select partition X, where X is the number of the approximately 4 gigabyte recovery partition. On my Venue, it was 6.
    18. Make sure you see “Partition X is now the selected partition”!!!
    19. Type delete partition override and hit enter.
    20. You should be greeted with “DiskPart successfully deleted the selected partition.”
    21. Type exit to quit DiskPart, then exit again to quit Command Prompt.
    22. Now that the partition is gone, we need to expand the size of the main partition.
    23. Open an Explorer window and long press This PC, then select Manage.
    24. When Computer Management appears, select Disk Management under Storage.
    25. You should see the 4.64 gigabytes or so we freed up showing as Unallocated.
    26. Long press your C: drive and select Extend Volume….
    27. The Extend Volume Wizard appears. Click Next.
    28. You’ll be asked where the space to extend the volume should come from. Everything should already be filled out to assign the maximum unallocated space. Simply tap Next or adjust as desired and click Next.
    29. The wizard will confirm the extension settings. Click Finish.
    30. There you go! Your C: drive is now almost five gigabytes larger!

    UPDATE: You can also back up to a USB drive by acquiring a USB OTG, or “On-The-Go”, adapter. Pick one up from Fry’s, SKU number 7582626, here. This will also enable you to use thumb drives and such on your Dell Venue 8 Pro.

    Disable the Annoying Backlight

    Dell’s power management settings for the backlight are wretched, making the display dim almost all the time. Let’s get around that, shall we?

    1. Swipe out the charms menu, then select Settings, then Change PC Settings on the bottom.
    2. Select PC and devices.
    3. Select Power and sleep.
    4. Set Adjust my screen brightness automatically to Off.

    Below are my notes from Day 1 of the CEATEC show in Makuhari, Japan.

    SAM_8159

    Sony Info-Eye + Sony Social Live

    Sony showcased two unique social apps, Info Eye and Social Live, part of their Smart Social Camera initiative.

    SAM_8244

    Info Eye takes an image and analyzes it for different types of information, surfacing that information in different views. For example, take a photo of the Eiffel Tower and you are presented with different "views" of the data. The first view may be related photos of the French attraction, such as a view from the top, or the Eiffel Tower Restaurant. Change views and you’re presented with a map of Paris. Continue to the next view and see your friends’ comments on social networks about the attraction. It certainly is an innovative approach to automatically get benefits from simple photo taking – photos you normally wouldn’t look at again anyway.

    A video is worth thousands of pictures, and you already know what those are worth:

    And in case you simply want a picture:

    SAM_8249

    Social Live is like a live video feed, streamed from your phone to various social services. While the example of a live marriage proposal wasn’t so realistic, Social Live still has great consumer applications. For example, set a Social Live link on Facebook and your friends could view your video feed while you tour the Champs-Élysées in Paris, without your needing to initiate a Skype call. It’s similar to having a live broadcast stream ready to go at any time.

    3D 4K Everywhere!

    3D didn’t entice the world – again – so, why not re-use all that marketing material, swapping 4K for 3D? No, it’s not that bad, and 4K is beautiful, but it’s just too early, too expensive, as is almost every evolutionary technology like this. Just for fun I  made a collage of the various offerings. Component innovation is once again creating products at a pace greater than the consumers’ willingness to adopt.

    4K_AutoCollage_12_Images

    Tizen IVI Solutions at Intel

    Intel had a sizeable display of Tizen OS based In-Vehicle Infotainment solutions at its booth. Apparently Intel had 800 developers working on Tizen while partnered with Nokia on the OS-formerly-known-as-MeeGo. The most interesting Tizen demonstration was Obigo’s HTML5-based IVI solution. On a related note, Samsung is apparently folding their Bada OS into Tizen. It will be interesting to see whether it makes any difference in the global mobile OS movement, still dominated by Android, then iOS, then Windows Phone.

    SAM_8250

    Obigo’s HTML5-based In-Vehicle-Infotainment Solution

    Obigo’s solution is to automotive application development what PhoneGap is to standard mobile application development. Developers build widgets using HTML5 + JavaScript, accessing vehicle data and services via an abstraction layer provided by the Obigo engine. Apps in Obigo’s system are called widgets. Nothing appears to prevent Obigo from bringing this solution to Android, so look for that possibility on the various Android vehicle head units coming to market. Hyundai and Toyota will be the first integrators of the system.

    SAM_8213

    Apparently Japanese Car Insurance is Very Expensive

    Another solution shown at the Intel Tizen display was a driving habits monitor capable of sending an email to your insurance company with said information. The goal would be to lower insurance rates. The solution was a hokey implementation at best, but at least I’ve learned insurance is expensive here as well.

    Fujitsu Elderly Care Cloud

    In an effort to keep Japan’s increasingly elderly population in touch with their families, Fujitsu has created a "Senior Cloud." The benefit to seniors will apparently be video and photo communication and sharing services with their family, alongside healthcare detail sharing services. I couldn’t get a demo, but it sounds like a good idea. For the next 10-20 years, anyway – by then, the "elderly" will have become the people who know how to do these things.

    SAM_8221

    ModCrew iPhone Waterproofing Coat

    ModCrew displayed a nano-coating solution for iPhones (only), rendering your fruit phone washable.

    clip_image001

    clip_image002

    Omron Basal Thermometer with DoCoMo Health Smartphone App

    Omron has a unique line of basal thermometers, with pleasant shapes and colors, targeted (obviously) towards women. The devices, among other Omron health device solutions, can all transmit their data via NFC to phones and tablets. Using an app from NTT DoCoMo, health data can be consolidated and analyzed, and health advice can be provided.

    clip_image003

    All health components gather data to recommend healthy choices.

    clip_image004

    Huawei Phone with Panic Alarm

    Chinese consumer and mobile electronics provider Huawei showcased their HW-01D feature phone with a built-in panic alarm. Targeted towards women, children, and the elderly, the device has a pull tab that sets off a loud, yet oddly pleasant, siren to scare away would-be perpetrators.

    SAM_8252

    Fujitsu Finger Link

    Fujitsu’s Finger Link solution uses a top-mounted camera to convert physical objects to virtual objects, enabling you to organize and relate such items for later manipulation. For example, put 3 Post It notes down and they are converted to digital representations, automatically recognized as separate objects. Tap each related item and drag a line between others similar to the first. Tap a button on the projected interface and now they’re related, moveable, sharable, and more.

    clip_image006

    Fujitsu Sleepiness Detection Sensor

    A hot item in vehicles displayed at CEATEC this year was detection of distracted driving. Fujitsu’s component detects eyes moving away from the road, a downward or upward motion possibly signifying the driver is drowsy. The component is for use by automotive integrators.

    clip_image007

    Fujitsu big data + open data quiet service, LOD utilization Platform

    Fujitsu showcased an open LOD utilization platform for quickly and easily mining and analyzing the data from many Open Data sources all at once, visually. The back-end is using the SPARQL query language.

    clip_image008

    Mitsubishi 4K LaserVue

    Mitsubishi showcased a prototype 4K Red Laser + LED backlit display, enabling a beautiful, beyond photorealistic video display. Standing in front of the reference unit, I actually felt like I was looking through a window – the colors were amazingly vivid and lifelike.

    SAM_8267

    clip_image010

    Mitsubishi elevator skyscraper sway detection system

    Mitsubishi also showcased a solution for preventing elevator stalls in swaying skyscrapers. Their sensor moves the elevator car to a non-swaying or less-swaying floor to prevent service outages, keeping the elevators running as efficiently as possible, and giving you one less excuse to miss that meeting.

    clip_image011

    Mitsubishi 100Gbps optical transmission technology

    Mitsubishi showcased a 100 gigabit/second inter-city optical interconnect solution, with a range up to 9000 kilometers.

    clip_image012

    Mitsubishi Vector Graphics Accelerating GPU

    Who says you need multi-core ARM processors running over 1 GHz + powerful GPUs for beautiful embedded device interfaces? Mitsubishi sure doesn’t. They showcased a GPU running at a scant 96 MHz, accelerating vector graphics display at up to 60 frames per second. Incredibly responsive interfaces for elevators and boat tachometers were displayed. The target is rich user interfaces with incredibly low power consumption.

    Related notes:

    SAM_8265

    Mitsubishi Rear Projection Display for Automotive

    It’s no surprise Mitsubishi is proposing rear projection solutions for automotive – RP is one of the company’s strengths. What they propose is curved surfaces to provide an interface that matches the interior of the vehicle. Also possible is 3D-like interfaces, as shown below.

    clip_image013

    Sharp Frameless TV Concept

    A display with no bezel? Sharp’s frameless concept showcases how beautiful such a solution would be. That’s it in the center.

    clip_image014

    Sharp Mirror Type Display

    Also on display (ahem) was the Mirror Type Display, with a display built into a mirror. Have I said display enough times?

    Pioneer Wireless Blu-ray Drive

    That shiny new ultrabook is pretty svelte, isn’t it? What’s that? You want to watch a Blu-ray? That’s fine – just use Pioneer’s BDR-WFS05J solution to wirelessly connect to the Blu-ray drive across the room and stream the data over 802.11N, as long as it’s in its dock. The unit also supports USB 2 and 3. Ships at the end of September.

    clip_image015

    Toyota Smart Home HEMS Using Kinect

    Toyota showcased a smart home energy management system (HEMS) using Kinect to interact with various residents.

    Toyota Concept Vehicles

    I don’t know much about the following one-person electric riders, but they looked cool, so enjoy the photos.

    clip_image016

    clip_image017

    Clarion Smart Access + EcoAccel

    Determining whether you’re driving Green, or "Eco" as they say in Japan, can be difficult. Clarion’s EcoAccel app, which runs on their Android-powered head unit, reads OBD2 sensor data to rate your Eco driving habits. It’s an entertaining way to enhance the eco-friendliness of your driving routine. The representative said there are no current plans to bring this product Stateside, but I’m hoping they change their mind. After all, OBD2 data is pretty easy to read, even if it’s not entirely standardized.

    clip_image018

    clip_image019

    clip_image020

    Mazda Heads Up Cockpit

    While the HUD component is nothing to write home about, Mazda’s approach of keeping everything at eye level, while re-organizing the shift knob area so it’s also easily manipulated, was a welcome safe-driving-meets-ergonomics approach. Better yet, they will be shipping this in their Axela vehicles, meaning less expensive vehicles may readily receive technology to deter distracted driving. They call this the Heads Up Cockpit with a Concentration Center Display.

    clip_image021

    clip_image022

    clip_image023

    Mazda Connect System

    Mazda also showcased the Mazda Connect system, enabling car communication and software components to be "easily" upgraded as new features are available. Whether this will be an insanely expensive solution, akin to Samsung’s upgradeable TV approach, remains to be seen.

    It’s fascinating to see how some of the most innovative products are coming from what used to be one of the least innovative industries: automotive.