A coffee-loving software developer.

Three weeks since the last one! I was fully expecting it to have been two weeks since my last “weeknote”, but I guess I've been too busy to notice.

The UK has gone back into lockdown again for COVID-19, although this time it seems not many people are taking much notice. Schools are still open. Any restaurant or cafe that does takeaway is still able to stay open, so all the chain coffee shops and fast food places are still open, just with seating areas cordoned off. Unlike last time, car traffic is still at a normal level. It hasn't really made any noticeable difference to my day-to-day life at the moment, as we've been pretty much staying at home anyway.


I've been busily working away on the next version of the sync algorithm. This has been quite frustrating as I've had a number of good ideas that didn't quite work out.

How do I know they didn't work out? Because I've actually created a test harness to test them now. I have taken a sample of about 50 pieces created with Choirless and written a test suite that will run each of them through the synchronisation algorithm to test how well it does. It produces a tile image of all of the tracks synced using 30, 60, 120 and 180 seconds of the audio file.
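The harness itself is conceptually simple. Here is a minimal sketch of the idea (hypothetical: `sync_offset` is just a stand-in for the real Choirless algorithm, and the piece data is a placeholder):

```python
# Sketch of a sync-algorithm test harness. sync_offset() is a hypothetical
# stand-in for the real Choirless synchronisation code.

DURATIONS = [30, 60, 120, 180]  # seconds of audio to use per attempt

def sync_offset(reference, track, duration):
    """Placeholder for the real algorithm: returns an offset in ms, or None."""
    # Here we just pretend every track syncs at a fixed offset.
    return 81

def run_suite(pieces, expected, tolerance_ms=50):
    """Run every piece at every duration; return a pass/fail grid."""
    results = {}
    for name, (reference, tracks) in pieces.items():
        row = []
        for duration in DURATIONS:
            offsets = [sync_offset(reference, t, duration) for t in tracks]
            ok = all(o is not None and abs(o - expected[name]) <= tolerance_ms
                     for o in offsets)
            row.append(ok)
        results[name] = row
    return results

pieces = {"piece-01": (b"ref-audio", [b"track-a", b"track-b"])}  # placeholders
grid = run_suite(pieces, expected={"piece-01": 81})
print(grid)  # {'piece-01': [True, True, True, True]}
```

With the real algorithm plugged in, each row of the grid corresponds to one tile row in the output image, one cell per audio duration.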

The reason for doing this is that we want Choirless to send up just a small section of the song to test sync whilst the user is using it, so that it can align the audio locally whilst they are listening. In the output above (which is just a small fraction of the total output), one of them (3rd row) has failed and is highlighted in red.

This not only allows me to test how well changes I make to the sync algorithm work, but also enables me to check we don't have any regressions in it, i.e. that I don't make things worse.

More importantly though, it allows me to tune the algorithm with something like Optuna, which I described in the Twitch stream I did a few weeks ago. In short, Optuna automatically runs the test suite over and over with different parameters to find the ones that work best. And by “work best” I mean successfully synchronise the most songs in the test suite.


I also wrote a post here on Coil about how Choirless is using an algorithm that was developed sixty years ago and first used on the Apollo 11 moon lander. So Choirless is literally rocket science now.

I have been busy today setting up to record a video for a series called “Developer Diaries” at IBM, all about Choirless. It has been a very interesting experience due to COVID-19. The IBM Originals production team in the US have sent me all the equipment: a large Pelican case filled with iPhones, an iPad, lights, microphones, etc., and several boxes with tripods and light stands in them.

I've spent the day setting the equipment up in my office, under the direction of the team in the US. They are then able to connect to the iPhones used for recording remotely and control the actual shoot itself using a platform called Openreel. They can remotely control the cameras and set the exposure, focus, audio levels etc. It really is pretty neat.

Tomorrow we do the actual shoot for the video, and hopefully it should be released in January.


My desk, lighting, streaming setup is now complete. It has been working really well. Since the photo below, I've cable-tied all the cables neatly to the poles and glued down the sheets of paper I'd taped over the lights to test additional diffusion.

I've actually been asked to do a write up of the setup for the IBM Developer blog. I want to get some of the making-of posts about the lights and the frame written first though so I can then reference them from the post about my setup.

It has been really nice to be able to simply hit the switch on my sit/stand desk to raise it to standing and have the lights come with me.

IBM Data & AI Digital Developer Conference

Last week I organised the “hallway track” on the IBM Data & AI Digital Developer Conference. It was a total of 6 hours, and I hosted it along with two colleagues, Sean and Upkar. We each had a two-hour long slot and had a mix of 30-min and 15-min sessions with each speaker.

This is a format that we've tried before, and found works really nicely. The main talks of the conference itself are pre-recorded and released on-demand on the day of the conference. But being pre-recorded means you can't chat to the speaker or ask them questions. So the “hallway track” or “watch party” was run on Twitch on the IBM Developer Twitch channel in parallel.

This time it worked even better than last time. We ironed out some of the technical issues we had, and used Streamyard (a hosted service) instead of OBS, which made it much easier to pull in remote speakers.

We had a fantastic crowd of people, and most amazingly the viewer level stayed pretty much constant right the way to the end.

The feedback from within the business has been fantastic, and I think this has proved itself to be a really effective approach.

You can watch the full hallway track on Twitch:


Why I stream live coding on Twitch

Speaking of Twitch. I was invited by a group in IBM called the Technical Consulting Group (aka the Totally Cool Geeks) to give a talk about why I do live coding on Twitch.


The full video is above, and you can read a write-up about my process and reasoning here. But in short, the whole point is to build a better connection and empathy with developers. With the live coding sessions you don't just see a polished demonstration; you get to see the actual process. You get to see all the bits that go wrong as well as right. You get to hear my thought process: the why, not just what I do. You can read more thoughts in the link above.

Oven Hack

Oh, forgot about this one, so just adding it in as I'm quite pleased with how this turned out. In our kitchen we have an electric double-oven:

All of a sudden the display and central control panel stopped working. The oven is, annoyingly, 2 months past its 2-year warranty. So it's too old to be fixed under warranty, but new enough that I really don't want to replace it. Especially as we are hoping to completely refurbish the kitchen next year and will get all new appliances then. We could limp by on just the smaller top oven, but we do quite a lot of cooking in this house and use it a lot. Also, with Christmas coming up (even in its reduced form with fewer family members, due to COVID distancing) we will need both.

The display stopped working, as did the bottom (main) oven. The lights still go on, and the fans whirr, so it still has power. It could have been an element blown, but that wouldn't explain the display going. I'd expect the display to come on, but indicate a fault. Strangely the top oven did still work.

So I pulled it out and had a look, thinking maybe the circuit board had gone. Sure enough there was an ominous looking burned resistor.

Taking a reading with the voltmeter, there was nothing across this resistor. But the voltage on one side of it was 54V DC: way too high for a circuit that should be 12V DC, according to the big relay in the centre of the board.

The resistor is so charred I can't work out its value, so I couldn't simply replace it. And besides, I still don't know what caused the voltage to be 54V DC; something else has obviously gone wrong.

As with many things these days a new circuit board would cost about 50% of the cost of the entire oven, which just annoys me.

Then the realisation... There is only one set of wires to the circuit board: the 230V AC wires in and out. And looking at the board, they are connected to the relay. It dawned on me that all the circuit board does is engage/disengage the relay based on the clock on the display, so you can set a timer for the oven to go on or off at a certain time. A feature we have never even used. This explains why the smaller oven still works, as it does not have timer control. The board is just a fancy inline switch.

So my hack...

Yup. I just cut the plug off the end of the wires to the controller board and connected them together, bypassing the board completely.

And we now have both ovens working again!

Well, I think that's all folks. Take care, and hopefully it won't be so long until the next weeknote.


Choirless is Literally Rocket Science

Fifty one years ago the famous words of Neil Armstrong were heard around the world:

one small step for man, one giant leap for mankind

Man had landed on the moon. The Apollo 11 lunar module had successfully descended to the surface of a lump of rock a quarter of a million miles away.

Onboard the lunar module was a guidance computer that calculated the descent trajectory to the landing site. One of the key pieces of information it needed was the position of the lunar module itself in space. This position information came from a series of four Doppler radar stations on Earth. The problem was: how to tell that the readings were accurate? These readings were used to adjust the onboard systems, and if any reading was too far out then the mission would have to be aborted.

Enter Rudolf Kálmán, a Hungarian mathematician. He had created an algorithm that could take a series of measurements from multiple sensors, each of which might contain statistical inaccuracies, and combine them to give a single, more accurate reading. On a visit to the NASA Ames Research Centre, aerospace engineer Stanley F. Schmidt realised that this algorithm, the Kalman filter, could be used for the Apollo navigation computer.

Since then, Kalman filters have been used in guidance systems from nuclear ballistic missile submarines to the navigation systems of the Tomahawk cruise missile. They are also used in the navigation systems of the craft that dock at the International Space Station.

So what has this got to do with music?

I have been working on the synchronisation algorithm for Choirless, a website that allows people to sing and play music remotely. Each person plays or sings their part, and Choirless combines them all together and gets them all in time with each other.


The magic behind the scenes analyses each audio waveform and extracts from it certain 'features' of the audio. These features include the rate of change of frequency (spectral flux), the rate of change of volume (crest factor) and the actual dominant musical note (chroma). These are used to measure how similar the audio files are at different offsets, so we can determine the alignment that puts everyone in time.
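As a rough illustration of what a couple of these features measure, here is a pure-Python sketch (not the Choirless code; real spectral flux is computed on FFT magnitudes rather than raw samples):

```python
import math

def rms(frame):
    """Root-mean-square level of a frame of samples."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def crest_factor(frame):
    """Peak amplitude divided by RMS: a rough 'how spiky is the volume' measure."""
    return max(abs(x) for x in frame) / rms(frame)

def flux(prev_frame, frame):
    """Very rough stand-in for spectral flux: total positive change in
    absolute sample level between two frames. (Real spectral flux works
    on FFT magnitudes, not raw samples.)"""
    return sum(max(abs(b) - abs(a), 0.0) for a, b in zip(prev_frame, frame))

# A pure sine wave has a known crest factor of sqrt(2) ≈ 1.414.
sine = [math.sin(2 * math.pi * k / 64) for k in range(64)]
print(round(crest_factor(sine), 3))  # -> 1.414
```

Computed frame by frame over each track, features like these turn two audio files into comparable sequences of numbers.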

We take many readings from each of the features above, making multiple attempts at calculating the synchronisation of the musical piece with the reference piece.

In the chart above, each of the faint red, green and blue lines represents a single attempt to synchronize the music. The horizontal axis represents the offset from the reference piece. The vertical axis represents a measure of how different the two pieces of audio are at that offset. So the idea is to find the lowest point (the least difference). In the chart above, the offset which fits best is at 81 milliseconds. That means if we trim 81 milliseconds off the start of the audio file, it will be in perfect time with the reference piece. But if we have multiple readings for the same thing, how do we work out which is the 'true' reading (the solid red, green and blue lines)?
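The search for that lowest point can be sketched as a brute-force scan over candidate offsets (toy feature values here, not the real implementation):

```python
def best_offset(reference, track, max_offset):
    """Slide `track` against `reference` and return the offset (in frames)
    with the smallest mean absolute feature difference."""
    scores = {}
    for offset in range(max_offset + 1):
        pairs = list(zip(reference, track[offset:]))
        scores[offset] = sum(abs(r - t) for r, t in pairs) / len(pairs)
    return min(scores, key=scores.get)

# Toy per-frame feature values: the track is the reference, 3 frames late.
reference = [0, 1, 4, 9, 4, 1, 0, 1, 4, 9, 4, 1]
track = [7, 7, 7] + reference
print(best_offset(reference, track, 6))  # -> 3
```

Each faint line in the chart is one such scan over one feature; the curve it traces is the `scores` dictionary plotted against offset.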

A Kalman Filter.
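A minimal one-dimensional Kalman filter illustrates the idea: each new noisy reading nudges the estimate by an amount proportional to the current uncertainty. This is a sketch with made-up numbers, not the Choirless implementation:

```python
def kalman_1d(readings, measurement_var, process_var=1e-3):
    """Fuse a series of noisy scalar readings into one estimate.
    Returns (estimate, variance) after processing all readings."""
    estimate, variance = readings[0], measurement_var
    for z in readings[1:]:
        variance += process_var                 # predict: uncertainty grows
        gain = variance / (variance + measurement_var)
        estimate += gain * (z - estimate)       # update: move toward reading
        variance *= (1 - gain)                  # fused estimate is more certain
    return estimate, variance

# Five noisy attempts at measuring the same ~81 ms offset:
readings = [83.0, 79.5, 81.8, 80.2, 81.4]
estimate, variance = kalman_1d(readings, measurement_var=4.0)
print(round(estimate, 2))
```

The fused estimate lands near 81 with a variance well below that of any single reading, which is exactly the property we want when combining the per-feature offset measurements.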

So an algorithm developed sixty years ago, with its first major application in landing man on the moon, is now being used to synchronize music, enabling people from across the globe to sing together.

Cover photo by Nong Vang on Unsplash

My dad always said to me as a kid: “bullshit baffles brains”. And that saying came back to my mind yesterday. There has been a whole lot of misinformation around the internet recently regarding the US election. This is not a new thing; the general trend started in the run-up to Brexit four years ago and has accelerated immensely. Of course there has been misinformation for as long as there have been humans, but the current scale is at a level that, I feel, makes it one of the main threats to society.

I believe that “bullshit baffles brains” is a military term. Which stands to reason as my father was a Red Beret (Paratrooper) in the British Army.

Yesterday I saw some tweets going around mentioning something called “Benford's Law” and showing two charts, one of which supposedly showed “fraud” in the Biden votes in the recent US election.

I'd not come across Benford's law and so took a bit of time to look it up. I am not a data scientist, but I am very interested in data science, and as a job I help explain data science and machine learning to software developers.

Immediately my alarm bells were going off. Why would the number distribution follow such a pattern, and under what circumstances?

Most natural statistics follow what is called a “normal distribution”. Take a person's height, for example. The average height for a male in the UK is 5ft 9in (175.3cm). But you know people shorter than that and taller than that, and the further from the average you get, the fewer people you know. If you were to plot the number of people at each height, this results in what is known as a “bell curve”.

Benford's law states that for certain distributions of numbers, the first digit is most likely to be 1, then 2, then 3, and so on. This seemingly odd conclusion arises when the numbers span several orders of magnitude, for example when many independent factors multiply together. It has famously been used to detect fraud in accounting data.
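The law assigns each leading digit d a probability of log10(1 + 1/d), which is easy to verify in a couple of lines:

```python
import math

# Benford's law: P(first digit = d) = log10(1 + 1/d)
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford.items():
    print(d, f"{p:.1%}")

# A leading 1 is the most likely digit (~30.1%), a leading 9 the least
# (~4.6%), and the nine probabilities sum to exactly 1.
```

So in a dataset that genuinely follows the law, nearly a third of the numbers start with a 1.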

But what about elections? In the tweets above, they plot the frequency of the first digit of the vote counts in each US voting precinct. The main problem with this is that those numbers are actually quite small and as a result don't comply with Benford's law, which only works when the numbers span many orders of magnitude. Most voting precincts have similarly sized catchment areas by design.

Taking this to an extreme, imagine that all districts had between 30 and 90 people in them. Then a chart plotting the frequency of the first digit of the number of people in each district would have no results for 1 or 2, as there are no districts whose population starts with a 1 or a 2.
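This toy case is easy to check directly (made-up district sizes, purely to show the missing leading digits):

```python
from collections import Counter
import random

random.seed(0)

# 1000 imaginary districts, each with between 30 and 90 people.
districts = [random.randint(30, 90) for _ in range(1000)]
first_digits = Counter(int(str(n)[0]) for n in districts)

print(sorted(first_digits))                            # only digits 3..9 appear
print(first_digits.get(1, 0), first_digits.get(2, 0))  # -> 0 0
```

Far from following Benford's law, the "most likely" digits 1 and 2 never appear at all, simply because of the range the numbers live in.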

Here is a fantastic video explaining it in a far better way than I can:


The key takeaway is that whilst Benford's law can show some evidence of numbers being “not normal” it does not necessarily point out fraud. You then need to look at why those numbers are what they are. And in the case of the US election it appears to mainly come down to a fluke of the distribution of the number of people in each district.

In fact, papers have been published showing that Benford's law is not very applicable to votes and elections, e.g. this one from Cambridge University in 2017, whose abstract states that it is “...problematical at best as a forensic tool and wholly misleading at worst.”

And so onto the second blatant attempt at bullshit baffles brains:

I've come across this chap before. He tries to trade on some false credibility from being the “inventor of email”. I've already covered Benford's Law above, which is how he relates to this mess. So I've dealt with the substance of his claims; now I'd like to move on to the man himself.

The “inventor of email”. He has an entire web site dedicated to trying to uphold this charade:

Of course, this is a complete lie to begin with. He claims as a 14 year old boy in 1978 he wrote a program called “EMAIL”. However engineer Ray Tomlinson created the first email program in 1971. Tomlinson coined the term “email” and first used the @ sign as a means to delimit the user from the machine they were on in an address.

But as with most successful conmen, Ayyadurai takes a kernel of truth and then adds layers of misdirection to it. The most obvious of these is the claim that he has a “copyright on email”. To many people that sounds pretty convincing. And it is actually true: he has a copyright issued by the US Copyright Office. But what for? First let's look at the various common intellectual property (IP) protections:

  • Copyright: an intellectual property protection on a creative work (story, music, etc)
  • Patent: an intellectual property protection on an invention, method, device, mechanism, etc.
  • Trademark: an intellectual property protection on a term, logo, name, etc

He has a copyright. But you can't copyright a concept, or a word. So what has he got? Read closer and he has a copyright on a program he wrote, called “email”:

In most jurisdictions, copyright is automatically granted to the author of a piece of work, i.e. this blog post is automatically copyrighted to me. Do you know what else I have copyright on?

An email program.


Granted, it is not a very good email program (or VERY good depending on your view of email). But it is a program. And it is called “email” and I have automatic copyright on that code as I wrote it.

So what does it mean to have copyright on “email”? Nothing, really. It is a completely nonsensical claim. It would be like Charles Dickens saying he has a copyright on Oliver Twist, and hence invented the “novel”. Except that Dickens is dead, and copyright automatically ends after a set period of time (usually a number of years, e.g. 70, after the author dies), at which point the works become public domain.

If he had a patent on email as a method/process then he could prevent others from implementing something that does the same thing. If he had a trademark on “email” he could prevent others from calling something “email”. But he has a copyright, which just means others can't copy the specific code he wrote 42 years ago. Code for a concept that someone else invented 7 years earlier.

But the point is, he tries to use this nonsensical, but true, claim of having a “copyright on email” to give weight to the lie that he is the inventor of email. This in turn has no doubt led to a number of other deceptions... ending up with him claiming some authority in his lie that Benford's Law indicates Biden's votes were manipulated.

And are people fooled by this charade? Yes, they are. The “inventor of email” has mathematical proof that Biden cheated. Sounds very convincing.

But none of it is true.


Time flies again! So what's been going on in the last week and a bit?


Lots of Choirless stuff this week. I was interviewed on BBC Radio Bristol on Sunday morning. It was great fun, but alas with only 5 minutes we could only cover a small fraction of all the great stuff going on. They did play a snippet of the pupils from my daughter's school singing though. So that was awesome, as it means Choirless has come full circle from an idea inspired by her to a real thing being broadcast on the radio.

I've been trying to get back into doing some more development on it, but I keep getting distracted by all the attention it is getting. Every day we get 3-4 emails from people wanting to join. Every single email tells a story. I would love to find a suitable way to publish them all, as they encapsulate so much of what we have been doing. Here is an example:

This sounds amazing! I'm a Bristol-based Musical Director and lockdown has been immensely frustrating to say the least. I worry that my orchestra and choirs could disappear completely if we have to use Zoom for much longer.

I'm talking to a number of parts of IBM marketing who are hopefully going to be able to use Choirless for the basis of some videos. Likely focussed on the developer side of making Choirless, but also in discussions about getting a band to do a performance with it and record it all from start to finish in real-time. So, stay tuned!

DEG Book Club

A colleague has started the DEG (Developer Experience Group, the business unit I'm in) book club. The first book we are going through together is The Developer's Guide to Content Creation by Stephanie Morillo. So far we've only gone through the first couple of chapters, and I've got some exercises to do. But I have to say, so far I've found the book excellent. If you are a blogger or technical writer, it is well worth it.


OK, so this is a silly thing. A colleague of mine made a typo when talking about a conference, writing: “...loved the whole experience as well as how their advocates connect with you while screaming too”. [she meant 'streaming']

...so I'm now the proud owner of the domain name screamfest.com. And I'm going to do it. I'm serious. I'm going to pick a date and organise a 1-day conference in which everyone just screams. No mute. Just screaming. There will be a CfS (Call for Screams) in which people will be able to submit their proposals for screams. Each session will be 5 minutes long. So far we've had ideas like:

  • Multilingual screaming
  • Screaming in LGBT+
  • Panel screams
    • XRP vs Bitcoin
    • Vim vs Emacs
    • Brexiteers vs Remainers
    • Democrats vs Conservatives
    • Toilet seat up vs Toilet seat down
  • Kids screaming
  • Pets screaming

I'm thinking this could be a charity event. Maybe we'll sell tickets for £5 a ticket with proceeds going to charity. Not sure what charity, anyone got any suggestions?

We will need a keynote screamer. I did check to see if the original Wilhelm Scream guy Sheb Wooley was still alive, but alas not. Someone has suggested Jamie Lee Curtis. Or maybe we could get the Finnish Screaming Men's Choir. And stress test Choirless at the same time.


I mentioned before about wanting to get a mechanical keyboard and sort out my working environment a bit. Well, I was inspired by a post by another developer advocate, Cassie, who made a pipe mount on her desk for lights, monitors, etc. I have been making up some lights for it:

These lights are to replace the large softboxes on floor stands I currently use. They are my DIY version of something like the Elgato Keylights.

I've bought the pipe and fittings to make my own frame for my desk, and they've turned up and I dry-fitted it:

The pipes are 34mm (1” nominal) pipes and fittings from an online shop called Keyclamp Store. These fittings are usually used to make handrails, climbing frames and the like. The pipe is slightly overkill for the weight it will be taking, but at least it won't wobble, and it is also the same diameter as the poles used in multi-monitor stand arms, so my existing monitor arm can be fitted to it.

The light panels I made will fit to the top of it, I'm just waiting on some clamps to arrive to attach them, and also some threaded wood inserts and bolts to bolt it to the wooden desk surface. The desk is a motorized sit/stand desk and the idea is that I will be able to easily move between sitting and standing and my lights, camera, etc all move with the desk as well and stay in the same position relative to my face.

I am trying to decide whether to keep the raw industrial look of the plain steel, or whether to paint it, and if so, what colour?

Streams / Workshops

I'm a bit behind in writing up and publishing the recordings of the Twitch streams I've been doing. But last week's stream on object detection using YOLO (You Only Look Once) worked out really well, so I hope to get that published soon.

Tomorrow I'll be doing a stream on using IBM Watson Text to Speech conversion, to start creating a site that can convert an RSS feed of blog posts into a podcast feed of audio. You can join me on the IBM Developer Twitch channel at 2pm UK time.

Take care!

This sunny Sunday morning I was interviewed live on BBC Radio Bristol about Choirless, the Call for Code project I've been working on for the past few months. This was a short interview just over five minutes long that started with a clip of the pupils from Wallscourt Farm Academy singing a song called “Let's Harvest” in celebration of their harvest festival.

This was a particularly landmark event for me, as this was Choirless coming “full circle” back to where it started. The original inspiration was to allow my daughter to sing with her school choir virtually during lockdown. We had a few of the kids test out the very first prototype we had, but this was the first time the school had officially used it. Whilst the pupils are physically back in school now, they are restricted to 'social bubbles' of about a dozen pupils to minimise any risk of COVID-19 spreading. So each 'bubble' recorded separately and Choirless merged them all together. In total there were 400 kids singing!

You can listen to the snippet of the performance and the interview itself on the BBC Sounds website. You will need to register for a free account to listen to it. Skip to 02:51:52 for where the interview starts.


Or here is a transcript:

Jonathan Ray: Yep indeed during these difficult times just have a listen to this.

[clip of pupils singing plays]

The pupils of Wallscourt Farm Academy in Stoke Gifford with a song especially for their recent harvest festival. What's really great about it is it was recorded using a new website called Choirless and Matt Hamilton the creator is on the line. Good morning Matt! Thank you very much for your time on BBC Radio Bristol, it’s good to talk to you.

Matt Hamilton: Good morning! Thanks for having me on.

JR: Yeah, so you are the parent of one of those little ones just singing.

MH: Yes I am, yes, my daughter Sarah was the inspiration for Choirless because she wasn't able to sing in the school choir whilst they were all studying from home, and yeah so that's why I decided to build Choirless.

JR: So that's amazing. Obviously you are quite good technologically speaking, and and you’re quite web savvy, because you can’t just knock these things up in a matter of moments unless you know what you are doing.

MH: No indeed I mean this was developed as part of a competition called Call for Code. I work for IBM and they have a huge great big competition with about 400,000 developers across the globe in about 179 countries. People came together to try and produce solutions to some of the big problems like climate change, and this year, COVID-19. And so, as part of that, I submitted Choirless as an entry and developed it as part of that. [note: I should have mentioned here that this has been built by a team of myself and colleagues Sean Tracey and Glynn Bird]

JR: Well you said your daughter was the inspiration for that, I mean, did you consult with her on what steps to take?

MH: I did I did a little bit. As with most nine year olds, she was very excited and then got very bored with me talking non stop about it. So we had an amazing interesting we've had over 200 choirs and bands contact us already to to use it. We've had choirs for blind kids in Scotland and a 150 piece school marching band in the U. S. we’ve had all sorts — churches, a lot of churches, coming coming to us as well because they are not able to do that sort of normal Sunday sing along.

JR: So, how does it work?

MH: So what happens is each of the band or choir leader records their first part, the lead part, and then invites everybody else they want to join and collaborate with them and they get shown and played back that part and they sing along to it. So it's almost like they do a duet with the with the leader and all the parts are brought together and using the power of IBM Cloud and A. I., it joins all the parts together. About five to six minutes later out comes a full piece. So you just heard 400 kids from Wallscourt Farm Academy singing there.

JR: Amazing because we've seen a lot about how social media and things like Zoom and Skype and the rest of it has brought choirs together in some some kind of shape or form what's next Matt? Is it world domination?

MH: Well that'll be lovely, yeah, that's what we are working towards! So we are expanding it out and have more features, we're adding into it as well to allow people to more control over the things that they’re producing and just getting more people using it and that's the sort of the main focus at the moment.

JR: You're getting sort of tweaks — you’re tweaking it as advised by people who use it who could give you suggestions — how about the web communities? Has it being taken on board from a wider field, perhaps? You know, has it attracted the attention, I mean you know, of Elon Musk or someone like that? I mean because it does sound quite sort-of revolution in some ways.

MH: Yes! Not Elon Musk yet. All the entries into Call for Code, everything's based around what is called Open Source Software. So the idea is that the software is freely available for anybody to contribute to. So we're open to other developers to get involved and to come along to contribute as well to the project. I’m looking for, as well, for musical partners — I’m not sure how musical Elon Musk is, but you know if we can get musical partners as well, you know..,

JR: It was the first thing that came to mind Matt! I mean I know it's a bit far fetched, you know ahead of itself but having said that you probably didn't realise you would be where you are now perhaps with this?

MH: No, it's it's been amazing we got to within the the internal IBM challenge for Call for Code, we came second place. It was a world wide challenge and it was presented as a big global gala hosted by CNN’s Van Jones just recently along with Chelsea Clinton , the CEO of IBM Arvind Krishna and it was… just yeah, it's been amazing the reaction we’ve had to it.

JR: Really, wow! What a shame you can’t hob nob with those big names you've dropped in person, you’ve had to do it via online as it were.

MH: Indeed!

JR: But, how do we access this if we are interested in using it.

MH: So if you go to choirless.com

JR: Yeah.

MH: You can go on there and there's a link to register your interest. Soon we're gonna be opening up to absolutely everybody, but at the moment you can register and we'll send you an invite. We are just slightly controlling the flow in so that we can fix any issues that come up as we go along, because each musical style is slightly different. So, you know, we'll try it with a choir and it might work; we might try it with a reggae band and it might not work quite as well. So we are tweaking it as we go along as each contribution comes in.

JR: Brilliant Matt! Lovely. Good stuff, well done! it's a great innovation in difficult times. Good to talk to you.

MH: Thanks a lot.

Call for Code Awards Ceremony

So tonight is the night! In about an hour and a half the 2020 Call for Code Awards ceremony and celebration starts. I know I bang on a lot about this, but it really is pretty awesome.

So I have about 2 hours to go... it starts at 00:30 my local time, but I'm staying up to watch it live :)

Call for Code is an initiative to get teams around the world to collaborate on trying to use tech to help solve some of the big problems that face our society. Previous years this has been about climate change, but this year there was also a COVID-19 track. I came up with the idea for Choirless, to help people sing and play music together remotely. We came 2nd place in the internal Call for Code competition for IBM employees.


Tonight we get to celebrate the participants of the external competition, and find out who won. You can read about the 5 finalists here. From an app that calculates the carbon footprint of purchases in your online shopping basket, to an app to help people queue safely during social distancing, to an app to help create lists of assignments for remote schooling.

There might be a clip of myself and Choirless in the event, but I've been told that the editors have been cutting down... so who knows!


Last week we (the UK and Ireland developer advocacy team) hosted an event called “Code@Think”, which ran after the “Think UK and Ireland” conference that IBM was running. I'll be honest: it didn't go well :( I mean, the actual talks themselves were delivered fantastically. But the whole event was let down by the platform it ran on, which made the experience pretty bad. And the number of attendees reflected this.

This has mainly come about from a bit of internal tension between developer advocacy and marketing, not helped by the changing environment we are working in with everything online.

A few examples of parts that were problematic:

  • Multiple websites. We had our own website for the event, which had a much more coherent feel and made the information much more accessible. The event platform itself (which we were required to use) had its own site, which everyone was directed to, and which made it much harder to see when each talk was happening.
  • Separate 'events' for each talk. Each individual talk was a separate WebEx Event. The main conference site did not direct attendees to a talk, or let them in, until literally the moment it started. This is a bit like locking the door of the lecture hall and only opening it when the speaker starts talking. So each speaker started out talking to an empty room. This was pretty demoralising for the speakers, and not helpful to the attendees, who missed the first few introductory minutes.

    Then, when one talk finished, everyone was booted out and had to actively go back in to the next talk. A very poor experience, and hence a much, much lower number of people in the talks than we'd hoped.

    A much better experience would be like the one we currently run on our IBM Developer Europe Crowdcast channel, in which you can turn up before the talk starts and be presented with a holding screen and countdown. The speakers can get themselves ready in a “green room” and then go live at the start, with everyone already in place.

  • No “hallway track” or social aspect. How can you engage with the attendees if you can't chat to them? Yes, there was a chat function in each talk, but it was totally isolated. There was no way to stick your head out into the hallway and make an announcement, or to chat to others socially. I wrote a blog post on LinkedIn about my experience running a “hallway track” on Twitch at a recent conference.

PayID Hackathon

I mentioned last week that I was entering another PayID hackathon. Alas, the gods were not smiling on me: I was working my socks off at the last moment and GitHub went down for two hours :( So I couldn't actually test my entry or create the video for it, and it never made it through to the actual event. But you can read about it. Or watch a quick demo video:

...speaking of videos. Cinnamon has had a big update to the site and how it works. I'm still getting used to it; I'm not sure whether I like it or not, or if it's just the usual “change sucks” thing. It used to be that when I went to Cinnamon I instantly saw videos from people I follow; now it takes me to all videos. It also seems to have some functionality around 'shorts' that I don't fully understand yet, but for each video I uploaded I needed to highlight a 30-second part as a 'short'. Time will tell.

BeEqual Badge

I'm very interested in diversity and inclusion. IBM has historically been a leader in this area:

  • 1899: IBM hired three women—Emma Manske, Nettie Moore, and Lilly Philp—20 years before women were given the right to vote.

  • 1899: IBM hired Richard MacGregor, their first black employee, 10 years before the founding of the NAACP and 36 years after the Emancipation Proclamation.

  • 1914: IBM hired its first employee with a disability, 76 years before the Americans with Disabilities Act.

  • 1934: IBM hired its first professional woman, 29 years before the Equal Pay Act.

  • 1953: IBM wrote its first Equal Opportunity Policy that called for equal opportunity in hiring regardless of race, color, or creed.

I am an LGBT+ ally: two of my (adult) kids are transgender, so I have personal experience of, and am very much aware of, some of the issues they face in the workplace.

But I want to do more. So I have joined an initiative at work called Be Equal.

Be Equal is an invitation to engage IBMers, customers and society at large in promoting the advancement of fairness and equality in business and society.

I want to help promote the kind of workplace that anyone, regardless, would feel comfortable working in.

This week I'm about to start filming a lecture called “Future Trends in AI” as part of the IBM AI Academy. I've not done anything like this before; the materials have already been prepared, so I am just going to be presenting them.

Talks / Streams

I've done two talks this week with my colleague Margriet. On Monday I hosted her weekly Lunch and Learn data science session, and then today she joined me on my Twitch stream to look through some crime data to see if we could spot signs of bias in the data. I really enjoyed co-presenting with Margriet, as you get a much more natural dialogue with a co-presenter than when presenting solo. So hopefully we will be doing more in that format in the future.

Mechanical Keyboard

I have decided I need to get an external keyboard. I currently just type on my MacBook's built-in keyboard, but I'm aware I'm getting some pains in my wrists. I've had them before, but they seem to be getting worse. Despite owning one of the original IBM Natural Keyboards like this one...

...I've always stuck to Apple's low-travel keyboards on the view that less travel must be better, right? But it has only really clicked with me this weekend why mechanical key-switch keyboards are better: they actuate before they actually 'bottom out', so you don't get the 'shock' travelling back up your fingers that you get with Apple keyboards.

I also really want to go back to the split/angled style of keyboard, as it is more comfortable for keeping my wrists straight. I type at a weird oblique angle on my MacBook in a way that emulates this, but it just means my fingers have to move in odd ways to hit the keys.

Doing a lot of reading around, I think what I want is something like the ErgoDox EZ, but it is 1) very expensive, and 2) only sold in the US, so it would need to be shipped over and hence incur customs fees. See point 1.

But I've seen a few on eBay here in the UK, so that might solve that problem. I was hoping for one I could buy from a major retailer, as I still have some BluePoints to spend (I was awarded a “Developer Hero” award for work I had done on my Twitch stream). But I might just have to go for one on eBay regardless.

Blog Post Reader

So another idea for a side project. Have I mentioned this before? I don't think so.


It came about from someone on Twitter saying they were looking for an app to read out web pages, so they could keep up with Coil posts whilst driving.

So my idea is this: a tool that monitors the Coil blogs you follow and automatically uses the IBM Watson Text to Speech service to 'read' the blog posts out, creating a podcast instead. You could then point your podcast app at a specific URL and it would serve a custom podcast feed built from the Coil posts you subscribe to. Even better would be if it could do the Coil payments end to end; I'm hoping to find a way to do this with (or work with) the Puma web browser. It would be a great value-add for Coil: subscribers could listen to posts read out. There is just so much good content out there, it's just hard to keep up with it all.
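To make the idea a bit more concrete, here is a minimal sketch of the podcast-feed half of it. This is just standard RSS 2.0 with `enclosure` tags (which is what makes a podcast app treat each entry as an episode); the `build_podcast_feed` function, the shape of the `posts` dicts, and the MP3 URLs are all hypothetical placeholders of my own, not an existing API:

```python
import xml.etree.ElementTree as ET
from email.utils import formatdate

def build_podcast_feed(title, link, posts):
    """Build an RSS 2.0 podcast feed from already-synthesised posts.

    `posts` is a list of dicts with 'title', 'url', 'mp3_url',
    'mp3_bytes' and 'timestamp' keys (a hypothetical shape for
    this sketch).
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["url"]
        # RFC 2822 date format, as RSS expects
        ET.SubElement(item, "pubDate").text = formatdate(post["timestamp"])
        # The enclosure tag is what makes this feed a podcast
        ET.SubElement(item, "enclosure", url=post["mp3_url"],
                      length=str(post["mp3_bytes"]), type="audio/mpeg")
    return ET.tostring(rss, encoding="unicode")
```

The missing piece is the text-to-speech step itself: with the `ibm-watson` Python SDK that would be a `synthesize()` call per post, writing the resulting MP3 somewhere public and recording its URL and size for the enclosure.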

Shame I didn't think of this before, as it would have been a great project to submit for funding from Grant for the Web. Maybe they will open another round soon?

PayID and UBRI Conference and SWELL

This week has seen the joint PayID and Ripple UBRI conference going on. There has been so much good content in it. There is a playlist on YouTube with each day's sessions in it. The first video is private for some reason, but you can scroll through to the others.


And starting tomorrow is SWELL, Ripple's annual conference. This is mainly aimed at the finance industry, but I've been invited along, so I will be interested to see some of the talks.

Roasting Coffee

Also, I created a brief video about roasting coffee at home:


And a blog post about it: https://coil.com/p/hammertoe/Roasting-Coffee-at-Home/jjmCwQ1C5

Right... well I think that is it... the Call for Code awards starts in 15 minutes!