On integrated experiences

What can the integration of hardware, software, machine learning and custom components achieve for a company? The experience gains for devices or services powered by this integration will provide the answer, either immediately or over the longer term.

In recent news
– Apple launched the M1 Pro and M1 Max. Each is “loaded with advanced custom technologies that help push pro workflows to the next level”. Highlights include on-device machine learning, integrated controllers for better I/O, a custom image signal processor for computational video and photography, and improved security.
– Google launched the Tensor chip. It promises to keep pace with the latest advancements in machine learning (ML). It is an attempt to move from ‘a one-size-fits-all piece of hardware into a device that’s intelligent enough to respect and accommodate the different ways we use our phones’. It will power speech recognition, computational photography, security and much more.

Both announcements highlighted experiences that other technology choices cannot deliver as effectively.

The M1 Pro and M1 Max have custom components for ProRes video. The iPhone already gives you the option of recording and editing videos in this format on the device, and the new laptops now offer hardware support through these processors. That should certainly speed up workflows for professionals working in this space. Edits in Final Cut Pro, DaVinci Resolve or Adobe Premiere that use ProRes already perform well because of Metal API support, and these chips will bump up the speed even further. The ecosystem is internal. If these devices are custom-made with creators in mind, there may be other devices on the horizon focused on other professional workflows, where the combination of hardware, software and custom components makes for a big performance boost.

Google, with its Tensor chip, is on a similar path for its mobile phones. It has the Android platform, plus machine learning and cloud infrastructure. The machine learning on the chip allows it to create unique computational photography experiences. So whether you are taking photos or videos, there is a level of computational ability that the company felt was not possible with earlier off-the-shelf chips.

Since these launches, I have been reading reviews and insights on these devices, but I was most curious about the response of the other technology companies that provide off-the-shelf components for computing and mobile devices. It is not as if other desktop or mobile computing options cannot deliver that performance. Intel and Qualcomm supply a large ecosystem of vendors, and compatibility with different combinations of hardware and versions results in their components being accepted across multiple devices. For example, Intel recently introduced the Alder Lake processors, which are supposed to have comparable if not better performance.

Finally, the performance choices come down to power and compute trade-offs, and whether customers will prefer these new devices or stay with multiple devices. Another point to add is extended support: the company with the integrated experience has a better chance of offering a unique service experience as well.

There is much to think about on integrated experiences.

More to imagine and build

I have been experimenting with the functionality of Metahuman Creator by Unreal Engine. This is still in the beta stage, but a few things are clear.

  1. The metahumans you see in the attached image are based on the sample models in the beta. You can do a fair bit of customisation: facial features, hair and body type, with some sample clothes and colours to experiment with. I am sure more functionality and customisation will follow.
  2. I started with a metahuman with long white hair and changed its features to visualise a younger version. The other experiment took a younger metahuman with short hair and imagined how they would look with a bit more white hair. Not quite perfect, but it still gives you an idea of the tool’s potential.
  3. The exciting bit is that you can take this metahuman (via Quixel Bridge) and put this into a scene in Unreal Engine and perhaps other platforms.
  4. You can take it to the next level with a tool like iClone: lip sync, motion capture and more, but that’s for another day.
  5. Gaming and visual effects are the obvious applications, but there are clearly more scenarios.
  6. Learning experiences, pre-visualising realistic experiences and walkthroughs, and linking back to other features are starting points.

People working on developing and visualising personas would find this creator very useful as well. More to imagine and build.




Note Taking Process and Apps

Change is the toughest thing to track. Sometimes the change is rapid, but at other times it can be subtle and barely noticed. Yet it may have made a big difference to someone’s experience.

Sometimes these changes take place in our own area of work, but changes in other areas are often the ones that make a tremendous impact on our sector.

Should you track changes? Yes. Reading about them and having a diversified portfolio of reading interests is one thing, but making notes about something interesting is equally essential. I have had a reading habit since my school years, though I didn’t make many notes back then; the note-taking habit has grown.

Call it sense-making, customer experience work, or whatever term you prefer: there is a lot to read, look at and understand to make sense of the customer journey for clients. For example, if a client’s business engages with customers in stores, clinics and many other formats, there is a lot of visual information by way of messaging, equipment, etc. It is helpful to know, for example, how digital signage is being used by different customers in various locations. Work-related reading could involve sector-specific insights, market research documents, books and more.

A Power Platform application serves as a visual repository for the team’s notes from market visits. It links back to SharePoint, and everyone has access to the information they need. Another application was a business news tracker for the specific topics each of us works on. These apps were developed quickly and have served us well over the years.

Continue reading “Note Taking Process and Apps”

Children – Data Games with their Future

There are three data points to think about:

  • Instagram is working on a version for children under 13
  • Proctoring applications are a lot more acceptable with educational institutions
  • The Metaverse and its inroads into the world of children

Let’s start with these, though.

Safety in the browser

  1. You may be aware of a new initiative called FLoC (Federated Learning of Cohorts). It is currently in preview/testing mode and will roll out soon enough. Google promotes it as a privacy-preserving alternative to third-party cookies. Still, it has received significant pushback; most alternative browsers are refusing to support the standard, and for now it is only available in Chrome. Will FLoC also cover students’ Google Workspace for Education IDs, and what about Chromebooks used by children? Hopefully, the status is opt-out from the outset.
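On the opt-out point: during the FLoC origin trial, a site could ask Chrome not to include visits to its pages in a user’s cohort calculation by sending a Permissions-Policy response header. A minimal sketch, assuming an Nginx front end (other web servers have equivalent directives for setting response headers):

```
# Opt this site's visits out of FLoC cohort calculation
# (header value documented for the FLoC origin trial)
add_header Permissions-Policy "interest-cohort=()";
```

This is a per-site server setting, so it protects visitors on that site only; it does not change anything for the browser as a whole.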

If you have concerns, Edge is the default browser on Windows machines, and you can also choose Firefox or Brave. The Mac’s default browser is Safari, but the other three browsers are all available there too. Meanwhile, Microsoft Edge’s new Kids Mode has appropriate features and content for children aged 5-8 or 9-12. It limits the sites that children can visit, adds safe search and strict tracking prevention, and the browser’s family safety feature is linked to a Microsoft account.

Continue reading “Children – Data Games with their Future”

A QR code in the Sky

Somewhere in Shanghai, there is a QR code in the sky.

It takes the technical wizardry of drone operations to create a QR code in the night sky. It also makes you wonder: is this needed?

Scanning conversations on Twitter, the responses included –

  • ‘future possibilities’ and applications
  • ‘scalable billboards and exponential infinity model.’
  • ‘beautiful’ and other terms for wonder too.
My own concerns run differently:
  1. Do we need the wonders of fingerprinting and ad-tech in the sky as well?
  2. Nighttime light pollution has been associated with many health disorders, as well as impacts on wildlife. This kind of service may add to the problem, along with the safety and energy burden.
  3. There is much discussion of the impact of nighttime light pollution on astronomy and on the work of earth observation teams. Adding to this are the mega-constellations of satellites going into service in Earth orbit, which will be visible from the ground.

Now imagine a child, fascinated by apps like Sky Guide, taking to a telescope. What will they see? A galaxy, a comet, a planet, human-made satellites or a QR code? Is there a correct answer?

Picture source : https://twitter.com/Pathfinder/status/1383491963068899336/photo/1