Microsoft have talked about their idea of “Three screens and a cloud” for the future of computing. The idea is you have your PC (laptop or desktop, or tablet once Windows 8 ships), your smartphone and your TV with an Xbox connected, and all your data is shared between them via the cloud… but it got me thinking: why only 3 screens? Why not 5?
The 5+ screens I am thinking about are as follows:
- The usual 3 suspects: main PC, smartphone and TV…
- Your Watch
- A Tablet (secondary machine)
- Possibly your car’s in-dash screen (which could be your smartphone itself, or use your smartphone for processing and an internet connection)
The idea of using your watch as a screen is not exactly new. The original Microsoft SPOT watch used an FM signal to get data from the MSN Direct service, including weather and, if I remember correctly, calendar items.
In recent times, companies like Sony Ericsson have developed the LiveView: a screen which connects to your Android phone over Bluetooth and shows information like phone status, weather, Facebook friends, and more.
If you look at the new iPod Nano, we are getting close… there are even watch straps for the Nano. Apple’s latest Nano came with extra watch faces because people wanted them! But the Nano is missing a couple of things:
- Bluetooth! It really needs it, not only for listening to music, but also for talking to other devices… imagine if your Nano talked to your iPhone and got data streams like news, weather, phone call, email and SMS information, and your next calendar appointment.
- Apps! It seems the Nano is running a micro copy of iOS, so in theory it should be able to run iOS apps… slightly modified apps, mind you. I can’t see it having enough power to run Angry Birds, but basic info apps should be manageable…
- A vibrate function: the ability to “buzz” when something happens… I carry my phone in an inside pocket or a pocket with other things, so I don’t usually feel it buzzing or hear it ringing… but if my “watch” buzzed, I would notice!
- A proper docking station: before I go to bed, I would like to stick my “watch” in a docking station which both charges it and syncs content to it. It would also sync with the dock, telling it what time I want to be woken up the next morning.
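The data-stream and buzz ideas above could be as simple as the phone pushing small framed messages to the watch over Bluetooth. Here is a minimal sketch in Python, assuming a made-up JSON framing; none of the field names are a real watch protocol, just an illustration of how little data would need to travel:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical notification payload a phone might push to a watch over
# Bluetooth. The field names are illustrative, not any real protocol.
@dataclass
class WatchNotification:
    kind: str        # "sms", "email", "call", "calendar", "weather"
    title: str
    body: str
    vibrate: bool    # whether the watch should "buzz" on arrival

    def to_wire(self) -> bytes:
        """Serialise to a compact JSON frame for the Bluetooth link."""
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def from_wire(cls, frame: bytes) -> "WatchNotification":
        """Rebuild the notification on the watch side."""
        return cls(**json.loads(frame.decode("utf-8")))

# Example: an SMS arrives on the phone and is forwarded to the watch.
msg = WatchNotification("sms", "New text", "Where is the meeting?", vibrate=True)
assert WatchNotification.from_wire(msg.to_wire()) == msg
```

A frame like this is a few dozen bytes, well within what a Bluetooth link and a watch-sized battery could handle.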
My car is currently a 2006 BMW 520d. It has the professional navigation system, which includes an 8.5” widescreen display in the dashboard. This is used by most of the functions in the car: audio, navigation, communications, in-car information and climate control, all controlled by the iDrive. If your car could connect to your phone (mine already does for Bluetooth phone calls) to get information and data, things could get much more interesting:
- Any music on your phone available at your fingertips.
- Be able to call up data from the internet, giving you tips on traffic, your latest calendar appointments, friends’ check-ins, and more.
- Your car being able to send diagnostics information to your mechanic over the internet. It could also send data like mileage, fuel usage, etc., so you can see how your fuel economy is doing…
- Apps, working both on your phone and your car, for operations like Navigation and Music.
BMW already offer some of this with ConnectedDrive, which covers a lot of these functions, but it currently only works with the iPhone.
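The diagnostics idea could boil down to the car periodically uploading a small report. A hedged sketch of what that payload might look like, with field names I have invented for illustration (this is not any real manufacturer’s API):

```python
import json

# A hypothetical diagnostics report the car might upload to the garage.
# All field names are my own invention, not a real manufacturer API.
def build_diagnostics_report(odometer_miles, miles_since_fill,
                             gallons_since_fill, fault_codes):
    """Assemble the upload payload, including economy since the last fill-up."""
    return json.dumps({
        "odometer_miles": odometer_miles,
        "fuel_economy_mpg": round(miles_since_fill / gallons_since_fill, 1),
        "fault_codes": fault_codes,
    })

# Example: 450 miles on 10 gallons since the last fill-up, one stored code.
report = build_diagnostics_report(48200, 450, 10.0, ["P0420"])
print(report)
```

The mechanic’s side only needs to parse the JSON; the interesting work is trending these reports over time.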
Your secondary machine
Most people have one main PC. Be it a big honking workstation like the GodBox, or a standard laptop or desktop, it is usually something fairly powerful that can be used for anything from internet browsing and email, to syncing music and videos to your devices (iDevices, Windows Phones, etc.), to video editing, photo processing and writing code. Either way, it’s a machine with enough power to do whatever is required.
But what about the second machine, the one you would use for tasks requiring less power? This is where I think the likes of an iPad, Windows 8 tablet or Android tablet come in. These are thin clients, but they still have enough power to do some work on directly. You’re not going to open your iPad and write code directly on it (well, maybe you could…), but you may log in to a remote machine and tweak code, or check how a build is doing. The “secondary” machine is perfect for this.
Think of it as a medium build machine. I mentioned the idea of a Medium Build Client when talking about Cloud Desktops a while back, and my thinking was around the Windows 8 tablet. Since a Windows 8 tablet can have a proper Intel processor, like the tablet handed out at the Build conference, it can run as a full Windows PC, including Visual Studio “11”. I see it as something that can be used mostly as a full machine, but also as a thin client for connecting to larger machines.
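“Check how a build is doing” from a thin client could be as simple as polling the CI server and summarising its reply. A small sketch, assuming a made-up JSON response shape rather than any real CI product’s API:

```python
# Turn a (hypothetical) CI server's status reply into a one-line summary
# suitable for a tablet or even a watch face. The JSON shape here is an
# assumption for illustration, not a real CI product's API.
def summarise_build(status: dict) -> str:
    state = status["state"]  # e.g. "running", "passed", "failed"
    if state == "running":
        return f"Build #{status['number']} still running ({status['progress']}% done)"
    return (f"Build #{status['number']} {state}: "
            f"{status['tests_passed']}/{status['tests_total']} tests passed")

# Example replies the tablet might receive from the server:
print(summarise_build({"number": 42, "state": "passed",
                       "tests_passed": 120, "tests_total": 120}))
print(summarise_build({"number": 43, "state": "running", "progress": 60}))
```

The heavy lifting stays on the build server; the secondary machine just fetches and renders a status line, which is exactly the thin-client split described above.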
So, what is the ideal solution?
I don’t think we are there yet, so there is no solution at the moment, but the ideal would be something like the following walkthrough:
- Be woken up in the morning by your alarm clock, which is acting as a dock for both your phone and smartwatch. The watch’s time is kept in sync with all your other devices. You stick the watch on, grab your phone and tablet (which has been syncing with your main workstation) and head down for breakfast.
- While you are reading the news and checking mail on your tablet, your watch buzzes to remind you about a meeting in about an hour and a half. It knows it’s going to take you about 50 minutes to drive to work, but traffic is a bit heavy, so it warns you to leave a little earlier. You finish up and jump in the car. Your phone and car talk to each other, and since your car knows you’re heading to work, it finds the route you usually take and checks traffic information. It notices there are delays on your usual route and re-routes you so you get to your meeting on time.
- You get to the car park, leave your car and head to the office, a 5-minute walk. While you’re walking down the road, listening to music from your phone on your Bluetooth headphones, your watch buzzes. You have a quick look: it’s a text from someone coming to the meeting, asking where it is. You hit the call button on your watch and have a chat with the client, giving them directions. When you get to the office, you hit the “send location” option on your phone and email them your current location, so they can find you on their phone.
- During the day, you work on your tablet, connected to an external screen with a Bluetooth keyboard and mouse. 90% of your work is done on a remote machine in your datacentre, allowing you to get extra power and machines as needed. Your servers all live in the cloud, and your data is backed up regularly. All your code is stored in your code repo (SVN, Git, Mercurial, TFS), and if your machine goes down, you can spin up a new instance and get back to where you left off pretty quickly.
- At about 3 o’clock, you hit the slump and head to the local Starbucks for a coffee and a half hour out of the office. You head down with the tablet, your phone and your headphones. Sitting in the back of the Starbucks, drinking your coffee, using their wifi and listening to music, you start looking through email and documents. You’re on instant messenger, keeping an eye on your bug tracker, when you notice an important bug coming in for a section of code you own. Since you don’t have enough bandwidth to work on the remote desktop at the same fidelity as in the office, you update your local copy of the code to the latest version, open your copy of Visual Studio, take out your travel keyboard and mouse and start coding the fix. You run your tests, check in, and a continuous integration build gets kicked off… at this point you finish your coffee and head back to the office. On the way, your watch buzzes to tell you the build has completed and all tests have passed. You send a “deploy to test” message from your watch’s custom application and head back to your office. You get back and all is good with the build and the world…
- You’re about to head home after a long day in the office, and you decide to chance your arm at cooking a Thai green curry. You check your tablet to see what is in the house and what is needed for the curry, and add the missing ingredients to your to-do list, which is synced with your watch. You head to the car park, hop in the car and hit the start button. As you drive out of the car park, the car notices the shopping list. It finds a local shop on the way home, but also realizes you need diesel. It adds the shop, and the cheapest fuel station on your route, to your navigation system and directs you along the quickest way home with the least traffic.
- After filling the car with fuel, and as you’re heading to the shop, your car informs you that your miles per gallon since the last fill-up was about 10% off your average. It also notices your last service was nearly 12,000 miles ago and suggests booking one. You confirm, and the car books itself in for a service, adding the details to your calendar.
- You arrive at the shop and take out your phone. You pull up the shopping list in a custom application, which knows the layout of this particular shop and shows you where the items you are looking for will be. You pick up the items, scanning them with your phone as you go. Your phone keeps a running total and pops up offers as you scan: “Did you know the shop’s own brand of this item is 25% cheaper?” or “You usually buy this every 2 months, and it has a shelf life of 10. If you buy 2 now, you get a third free.” When you arrive at the checkout, you tap your phone on the NFC reader. The assistant asks if everything scanned correctly, and you pay using the NFC system in your phone. You bag everything, head out to the car and drive home.
- Once home, you place the tablet in the kitchen dock. You load up the recipe, and it starts talking you through what needs to be done. You miss a step, so you ask it to repeat the last item. While you wait for it to finish, you load up the Netflix application and see what is in your instant queue. Looking through the new releases, you spot a film you would like to watch, add it to your queue, and finish cooking your meal.
- You head to the sitting room, the lights automagically come on, and you turn on the TV and Xbox. You tell the Kinect to load up Netflix and select the film you want to watch. It dims the lights, closes the curtains and starts playing the film. You sit back with a beer and your curry, and relax.
- After a few more beers, a very tasty curry, and an enjoyable film, you head to bed. You stick the tablet in its charging station, your phone and watch in their docks, and yourself into bed… it will all start again the next morning.
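The fuel-economy warning from the drive home is the simplest piece of the whole walkthrough, just arithmetic: flag any tank more than 10% below the running average MPG. A quick sketch (all the figures are made up for illustration):

```python
# Flag a tank of fuel whose economy falls more than `threshold` (10% by
# default) below the running average. Figures are invented for illustration.
def mpg_warning(miles_this_tank, gallons_this_tank, average_mpg, threshold=0.10):
    mpg = miles_this_tank / gallons_this_tank
    shortfall = (average_mpg - mpg) / average_mpg
    if shortfall > threshold:
        return f"Economy down {shortfall:.0%}: {mpg:.1f} MPG vs {average_mpg:.1f} average"
    return None  # within tolerance, stay quiet

# 400 miles on 10 gallons is 40 MPG, roughly 11% below a 45 MPG average,
# so the car speaks up:
print(mpg_warning(400, 10.0, 45.0))
```

The same shape of check (current value against a running average, with a tolerance) covers the service reminder too, just with miles since the last service instead of MPG.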
This idea is not as “pie in the sky” as it might sound. We have the technology and the ability; we just need to link it all together… it’s the magic glue that needs to be developed, and I am left wondering who is going to develop it. Apple? Microsoft? Google? They could all do it, with Microsoft probably the closest, but the real question is not who will get there first, but who will do it in the most integrated way.