As I previously mentioned, I've provided a copy of the new version of Waggle to everyone who purchased a copy of the original version. Overall the feedback has been really positive. I've had some good suggestions for improvement too, and I do plan to work on these in a future update. In the meantime, Waggle is now available for everyone to check out and download.
The new version of Waggle now has:
A new icon; I got rid of my mouse cartoon guy. I liked him, but thought the new version could do with a new icon!
A new algorithm for detecting waggles.
A new animation; when a Waggle is detected the circles will grow and animate, then shrink and eventually vanish when you slow the waggle down.
A start on boot; by default, Waggle will automatically start when you log in to your machine. You can turn this off by right-clicking the Waggle arrow in the notification tray and selecting “about” – there is a new dialog box with a check box.
A new window display routine; in the old version the circles would sometimes appear hidden behind other windows. I’ve worked hard to fix this, and it seems a whole lot more reliable now.
I created Waggle a couple of years ago to scratch a personal itch: losing my mouse pointer across one or more high-definition screens. The application was fairly simple – want to find your mouse pointer? Then shake it from side to side. The application would draw multicoloured circles around the mouse pointer, and once you stopped shaking the mouse the circles would vanish.
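The shake-to-find mechanic described above could be sketched roughly like this. This is a hypothetical illustration, not the actual Waggle source: the idea of counting rapid horizontal direction reversals, and the window and threshold values, are my assumptions about how such detection might work.

```python
import time
from collections import deque

class WaggleDetector:
    """Sketch of shake detection: count rapid horizontal direction
    reversals of the pointer within a short time window."""

    def __init__(self, window_secs=0.5, min_reversals=4):
        self.window_secs = window_secs
        self.min_reversals = min_reversals
        self._reversals = deque()   # timestamps of direction changes
        self._last_x = None
        self._last_dir = 0

    def on_mouse_move(self, x, now=None):
        """Feed successive pointer x-coordinates; returns True while
        the movement looks like a waggle."""
        now = time.monotonic() if now is None else now
        if self._last_x is not None:
            dx = x - self._last_x
            direction = (dx > 0) - (dx < 0)
            if direction and self._last_dir and direction != self._last_dir:
                self._reversals.append(now)   # pointer reversed direction
            if direction:
                self._last_dir = direction
        self._last_x = x
        # forget reversals older than the time window
        while self._reversals and now - self._reversals[0] > self.window_secs:
            self._reversals.popleft()
        return len(self._reversals) >= self.min_reversals
```

Feeding in an oscillating sequence of x-positions trips the detector, while ordinary one-directional movement never accumulates enough reversals.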
I eventually released the first version of Waggle. Several people downloaded the application, and I reached out to a few of them for feedback. This led to me wanting to do an updated version. However, the update got delayed a little. You see, at the same time as trying to do the update I moved countries, changed continents, jobs, houses, and cars, and got my kids into a new school. To say it was hectic would be an understatement.
But this itch of wanting to do better has stayed with me, and now, I'm delighted to say that I have an update. This update provides the following:
Start the application when the user logs into their account
New waggle detection code
New animation code allowing circles to grow as you waggle and shrink when you stop
New window handling code to try to ensure that the circles always appear on top of every window on the screen
A new installer
I am sharing an early version of Waggle My Mouse 2.0 with some of the folks who purchased the original copy, and I'm looking for feedback and ways in which I can improve it. Once I'm happy with it, I hope to provide it to everyone who'd like a copy.
Finally, why circles and why those colours?
Well, this is a question I have been asked. The colours were picked while looking at my son's Crayola crayon case. Thinking about the desktop, I didn't know what application would be running on the machine, nor what would be displayed, but it was important to show the mouse location. The crayon case contained nearly every primary colour, so I used that as a starting point. You'll notice that if you shake for long enough, nearly every colour in the crayon case will appear.
At first I wasn't too sure what to make of it. An email explaining that a mobile application I’d written had changed someone’s life. But it turned out to be the single best piece of user feedback I've ever had and probably the best email I've ever received. It all started with an ultimatum from my wife, and a desire to create an app which would “scratch my own itch”....
Our last post closed with the wish that Mattie would walk into his next physio appointment under his own steam – well guess what? Our rock star baby did just that!
At our last visit (May) he was doing well but he still wasn’t sitting up from lying down or pulling himself up to standing. His physio gave us some very simple exercises to work on with him that would further strengthen his pelvic muscles and really help get him standing and walking independently.
In the three months between appointments it was almost as if Mattie knew he had something to prove. Once he started to realise he could pull himself up the transition to cruising furniture and tentatively letting go happened so quickly.
On Thursday August 14th we returned to the physiotherapist. Our physio appointment lasted approximately 5 minutes. Mattie’s physio was delighted beyond words with his progress. He is progressing exactly as expected – his gait was still a little wobbly when she saw him, but everything, even down to how he places his feet, is excellent. There was lots of clapping, cheering and delight for Mattie at his appointment. At the end I am delighted to report that we were firmly sent on our way (with instructions to come back should we ever have any concerns) and officially discharged from physio!!!
Even in the few weeks it has taken me to write this blog post Matthew’s confidence in walking has just escalated. Now, in the morning, he wants to put his back pack on – just like his big brother – and toddle out to the car for crèche. We just couldn’t be prouder of him! And the best thing... he does it all with the biggest, cheekiest smile on his face.
Our final remaining hurdle is our EEG and Neurology appointment on September 23rd. At this appointment we will review with neuro to determine whether or not we can start to wean Vigabatrin. This will be a nerve-wracking one, but if Mattie can approach it with a smile on his face you can be sure that we will too.
... and the fan stopped and the blue LEDs faded, little did I know it would be for the last time...
Eight years ago I bought my PC. At the time it was a beast of a machine – for those that are interested, it had a whopping 1 GB of RAM, an AMD64 CPU, and a GeForce graphics card. An awesome PC: it's had three operating systems and multiple uses, from a desktop workstation to a gaming rig (and a pretty awesome one at that), until finally running as an Ubuntu desktop and server machine.
I bought it just after leaving my job at EMCC as a development workstation for my new role working at home.
The only real downside, and one of the precipitating factors leading to failure, was the fan. It was loud, when running it sounded like an aircraft taking off and was audible throughout the upstairs in my house.
It was the smell that hit me first – warm silicon doesn't smell great. During our current heat wave one of the PC's fans failed. The upstairs of my house was filled with the smell of hot silicon, and the processor alone was reporting 85°C!
It was definitely time to turn it off. It is sad to say goodbye to the old boy, but it’s definitely time. For a PC it was a grand old age; it’s had an awesome innings. That PC has seen me get married, move jobs more times than I want to recount, move country, and seen the arrival of two children. Its most recent job was as a photo repository, mini workstation, and it was converting some children’s DVDs to play on our AppleTV.
The King is dead, long live the King
The time has come to start construction on a new machine, and for the first time since 1996 I've decided to build my own. Hopefully a much quieter one. If I'm lucky this one will also last me eight years.
Let's just hope nothing has happened to the memories currently locked away on the old boy’s hard drives.
The rules are simple, the stakes high, and the competition tough. Prove your street cred with your friends and family – will your chosen tune outlast the dreaded 'bump'?!
We've invented a new game. It was so much fun playing it that I felt the need to share it with the wider Internet. To play you'll need an Apple TV, an iPhone or tablet, the YouTube application and some friends.
How does it work? Simple. Pick a decade, then search YouTube on your iPhone or tablet for the best tune. Once found, use the Apple TV to project it for everyone to see. If it's good, it will stay playing; if it's bad, then someone else will find another tune on their Apple device and stream it to the Apple TV – bumping your selection off the telly.
When we played the game, bad tunes were met with a chorus of "BUMP IT!", and good tunes were met with rounds of bad singing as we all joined in.
It's great fun, simple, and rather addictive. I hope you enjoy!
They say that hindsight is always 20/20. It's true. I'd been given 5 minutes in front of some of the most influential people in Europe to make my case, and 30 minutes after my nerve-wracking speech, I sat down and thought about everything I couldn't cram into the 5 minutes I had been given.
It's been a while since I shared the press release from the European Commission which highlighted the talk I gave and the work my team did as part of the EIT Young Leaders program. I owe you all an explanation of what I was working on, why I feel so passionate about it, what I said during the speech and what I wished I had the time to cover.
It was nerve-wracking as I stood up to join the other speakers on the stage: Kenneth Cukier, Economist editor and author of big-data-book.com; Alfred Spector, Vice-President of Research and Special Initiatives at Google; and finally Gavin Starks, CEO of the Open Data Institute. I was nervous – these are big names, and the audience was just as impressive, consisting of CEOs, CTOs, VPs, representatives from industry, EU policy makers and European Commissioners.
I had been given 5 minutes to talk about how Big Data can be used as a catalyst for social change. Months of work, research and internal team battles turned into just 5 minutes. It feels like an eternity when you’re standing in front of an audience, but in reality 5 minutes is just less than three paragraphs of text. That's not much to convey everything the six of us on our team wanted to say.
The Unrealised Opportunity
When you speak to advocates of big data, they all preach about how the computing power of today is cheap, digital storage is cheap, and we have lots and lots of information about everything we all do every day. All we need to do, they say, is study the data to see the great advances we can make. The problem is, when you study Big Data, you see lots of advocates, but not as many success stories. Why? What's wrong with the Big Data industry? Why aren't these fantastic opportunities being realised, and what can we do about it? This was the question which led the team and me on a fantastic journey.
Investigating the Obstacles
Let’s depict a vision of the future; a vision we want to create. Picture a beach in the future, with a little boy on it. A little boy who has less chance of suffering from cancer than anyone today. A little boy who has less chance of suffering from diabetes than anyone today.
In the future we've taken 25 years of shopping habits for over 1 million people and combined it with health records to reveal lifestyle and dietary choices which increase the risks of diabetes and cancer. We've shared this information with this little boy and his parents and allowed them to change their lifestyle to avoid many of these risks.
This little boy also gets more time with his parents than anyone today, because his parents spend less time in traffic than we do. They have a smartphone application which provides them with a forecast of road traffic before they leave the house.
Now that we have the vision, how did my team try to create it? Let’s start with the traffic prediction application: how could we build that?
Building the Future
To build this application we need access to energy consumption data. The application combines electricity consumption data with traffic data. As we all wake in the morning we turn on TVs, radios, kettles and showers. This causes a spike in the amount of energy that we use, and this spike occurs between 30 minutes and 1 hour before we leave our houses. It can therefore be used as an indication of when we will leave home, and hence what future traffic will be like. This electricity data exists – we know this because the energy companies capture it to bill us, and the energy distributors capture it to ensure they supply the right amount of power to the right places at the right time. But it's not freely available; in fact it is locked away.
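As a toy illustration of the spike idea, the per-household estimate could look something like the sketch below. Everything here is an assumption for illustration – the baseline factor, the overnight window, and the function itself are hypothetical, not part of any real system.

```python
def predict_departure_window(readings, baseline_factor=2.0):
    """readings: list of (minute_of_day, kwh) samples for one household.
    Returns an estimated (earliest, latest) departure minute, or None.

    The morning spike is taken as the first post-5am reading exceeding
    baseline_factor times the overnight average; departure is assumed
    30-60 minutes after the spike, as described in the text."""
    overnight = [kwh for minute, kwh in readings if minute < 5 * 60]
    if not overnight:
        return None           # no overnight baseline to compare against
    baseline = sum(overnight) / len(overnight)
    for minute, kwh in readings:
        if minute >= 5 * 60 and kwh > baseline_factor * baseline:
            return (minute + 30, minute + 60)   # leave-home window
    return None
```

For example, a household idling at 0.2 kWh overnight that spikes to 1.5 kWh at 7:00 (minute 420) would be predicted to leave between 7:30 and 8:00. Aggregated over millions of households, windows like this are what would feed the traffic forecast.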
There are initiatives which try to unlock this data – schemes like Green Button in the US or MIDATA in the UK, which allow individual users to download their data from energy suppliers. These initiatives are focused on providing the data back to the individual. To use them to power our application, we would need to contact each household individually and ask for their energy consumption data. For a city like London this could mean asking 3 million households for their data.
As an entrepreneur I've a lot of ideas and very little cash, and contacting 3 million households is very expensive. It is also risky: there is a good chance that not everyone will respond, so I won't have all the data I need.
What if we could just ask the companies which already own this data? - The energy supply companies and the energy generators. It turns out that there are several issues making them reluctant to release this data:
Competitive advantage; what if their data ended up with a competitor? The competitor would then know who their biggest customers are.
Privacy issues; the energy companies' customers provided their data only for billing and not for anything else, and data should only be used for the purposes disclosed when it was originally captured.
Currently Open Data Initiatives focus on releasing data for free in a hope that this will spur innovation, and kick start ecosystems. But we haven't seen a huge uptake in this area. There are a number of factors preventing this:
Data reliability: the data is supplied for free, usually on a best-effort basis. That sounds ok, until you realise that in order for this data to be used commercially it is important to have confidence in it. Essentially the open data community is asking developers to take a leap of faith – to trust that their livelihoods, homes, and families will all be safe and secure based on the income generated by a best-effort data release. That is a big ask.
Timely data: Open Data Initiatives often partner with data producers and manually scrub data of information which may identify individuals, or other commercially sensitive information. This process takes time, and as a result a number of the open data initiatives provide "canned data" from historical data sets. This limits the applications to which this data can be put. We couldn't create a real-time traffic prediction application on this type of historical data alone.
What if we were to invert the question?
Privacy is also a concern; we don't want or need to pry into an individual’s details – in fact for our app, seeing an individual’s energy consumption data is next to useless. It is like looking at a grain of sand when what we want to see is the beach.
Big Data is by definition big: getting a copy of this data is expensive and slow, and we don't need one. While computing is getting cheaper, and cloud computing is even more efficient, it is still not free, and big data requires lots of it.
What if, rather than getting a copy of the data, we could simply get an opportunity to do some statistical analysis of it? The data wouldn’t move; it would stay within the owning organisation. We could create a software infrastructure which ensured that access to the data was safe and privacy compliant. It would reduce the cost of access for a small start-up, address the privacy issues, and mitigate the risk associated with disclosing data seen as a competitive advantage.
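A minimal sketch of what such an infrastructure might look like is below. This is purely illustrative: the class, the aggregate-only interface, and the minimum-group-size rule used as the privacy safeguard are all my assumptions, not a design we actually built.

```python
class StatisticalGateway:
    """Sketch of "the data stays put": the owner exposes only
    aggregate queries over its records, refusing any query that
    covers too few households to be released safely."""

    def __init__(self, records, min_group_size=100):
        self._records = records           # raw data never leaves this object
        self.min_group_size = min_group_size

    def aggregate(self, predicate, field):
        """Return (count, mean) of `field` over matching records,
        or None if the matching group is too small."""
        values = [r[field] for r in self._records if predicate(r)]
        if len(values) < self.min_group_size:
            return None                   # refuse: group too small
        return len(values), sum(values) / len(values)
```

A start-up could then ask for, say, the mean 7am consumption across a whole postcode without ever seeing a single household's readings – the grain of sand stays hidden, the beach is visible.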
The data would be safe, but the statistical value of the data would be set free. We would liberate the value of the data.
Free as in Beer (which you have to pay for)
What I'm going to propose sounds, at first, a little nuts: I think free data isn't the right thing to base an ecosystem on. In fact I think it discourages the ecosystem from taking root. As mentioned above, there are a number of problems with free data which prevent developers from taking dependencies on it. One of the best ways to address this is to pay for the data. Once money changes hands, SLAs (Service Level Agreements) can be put in place, and if there is an error in the data, or it is provided late, then there is a route of recourse and the developer can chase the data provider for recompense. This shared-risk and charging model allows the ecosystem to grow. It encourages new data providers to enter the market and allows developers to more confidently base their families' futures on the data they provide.
This charging model also has the opportunity to disrupt existing marketplaces, as it provides a new way for businesses in existing ecosystems to generate additional revenue. We considered the energy marketplace in Europe. Drawing the value chain for this marketplace, we see the following:
Power generation companies sell their energy via power transmission companies to a Distribution System Operator (DSO); the DSO in turn supplies the end user via the retail companies we all subscribe to.
For the purposes of our traffic prediction application it is the DSO which possesses all of the real time information we need. The DSO needs to load balance its network to ensure that the right customers get the right amount of power at the right time. To do this the DSO has real-time, live information about power consumption. However the DSO never gets to communicate this information to the end user.
Getting the DSO to release this information, even via the statistical analysis method described above would not be trivial. There are costs associated with it. To deal with this our team proposed the creation of a data broker. The broker could amortise the costs of the technology across a number of data sources. It would provide the marketplace for data services from a range of different industries and it would provide the data providers with additional revenue.
Adding this additional revenue stream into the originally presented value chain, we get a situation where the DSO is generating income from two different sources. Of course, similar additional revenue streams can be obtained by all of the energy companies in the original value chain. This could change the relationships between the companies and disrupt the current status quo.
As the broker expands into new industries it will create a marketplace of data producers competing for data consumers. This competition should help ensure a low enough price point for the data. Keeping the price point low is important to encourage entrepreneurs to really get involved in the marketplace.
The key to obtaining this future – the future on the beach – is allowing companies and individuals to come up with ideas, test them, and fail quickly or succeed in a big way. In order to do this we need to create a working ecosystem in which they can experiment. This will provide society with insights and benefits beyond what I or anyone else can outline, and will really make the vision of the boy on the beach a possibility.
The real beauty of Twitter is the ability to discover cool people, with shared interests and likes. It's like finding that cool group of people in a crowded party. Meeting someone new and really getting to know an area they are interested in and sharing their passion.
Now a music application which lets you do that, that would be cool. It is what Twitter's new music application could have been.
Instead we've an application which provides a bland pastiche of a generic radio station – one that plays the over-hyped, almost factory-farmed tunes. It's like being back at the party but not being able to talk to anyone because of the booming music played by the host.
Now if only there was a cool way to share great new music with those cool guys in the corner... maybe SoundCloud has the answer?