Back in 2015 Drake Music launched their DM Lab project in Manchester. The aim was to nurture a community of hackers/makers/coders interested in developing new musical instruments, ideas and technology to remove barriers experienced by disabled musicians.
As a sound artist, I was intrigued by the proposition of meeting like-minded people, so I went along to the first few sessions at Madlab and enjoyed meeting the group. The sessions continued on a monthly basis, and over time we all got to know each other a bit better.
Two years later, DM scored some funding to develop the project further. They launched a commission opportunity, dubbed the DM Lab North West Challenge, designed to stimulate further cross-pollination of disabled musicians and the hacking/making community.
4 commissions of £700 were up for grabs, each awarded to a new team to create an accessible instrument or piece of tech. Each team had to be made up of at least one disabled musician and one hacker/coder/maker, and was tasked with coming up with an idea and then a proposal to submit to the team at Drake Music.
Following on from conversations I’d had in the group DM Lab sessions, a team grew around me (Lewis Sykes – technologist/musician, Mike Cook – electronics expert, James Medd – musician/educator and technical coordinator at Eagle Labs and Craig Howlett – sound engineer) and the idea for the Nashesizer emerged.
For a while I had been frustrated by the digital audio workstations I had been using due to their inaccessibility. I have cerebral palsy, and this means fiddly mouse movements, swipes, taps and other common gestural control actions can be a barrier for me.
For normal computer use I use a trackball instead of a mouse, and I was hoping that for the DM Lab Challenge we could develop a MIDI controller/DAW interface that might, for example, feature a trackball as well as other more accessible controllers, so I can move freely and quickly around the screen. No one wants their creativity hampered by clunky equipment.
This marked the beginning of a fascinating journey for myself and the team. In the first instance, we put the proposal into Drake Music and won one of the 4 commissions of £700. The only catch was we only had 4 weeks to make it!
An early version of the Nashesizer was showcased at a public launch for the DM Lab Challenge at the International Anthony Burgess Foundation.
The response we got at this event was fantastic. We garnered a number of questions from audience members that gave us pause for thought, and helped us to understand the potential the Nashesizer might have for other producers with similar barriers. Not long after this, I put an application in to Sound and Music, and successfully scored a further £5k to develop our idea.
Prior to my involvement in the Nashesizer project, I had favoured Studio One as a DAW, but for a while I’d been wanting to explore Ableton Live, and so it is primarily Ableton that the Nashesizer has been designed to interface with.
Our first proof of concept therefore featured a joystick that could be moved to select tracks in Ableton and navigate the screen easily, plus a rotary encoder (a chunky dial) that could be used, for example, to increase or decrease volume, pan left to right, or raise and lower the signal to sends when using effects such as reverb.
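To give a flavour of how an encoder like this can drive a DAW parameter, here is a minimal sketch in Python. It is a hypothetical illustration, not the actual Nashesizer firmware: it assumes the controller sends standard MIDI Control Change messages (which Ableton can map to track volume, pan and sends), and the names and CC numbers are my assumptions.

```python
# Hypothetical sketch: turning rotary-encoder detents into MIDI
# Control Change messages a DAW such as Ableton could map to a
# parameter. Not the Nashesizer's real firmware.

VOLUME_CC = 7   # CC number conventionally used for channel volume
PAN_CC = 10     # CC number conventionally used for pan

def clamp(value, low=0, high=127):
    """MIDI CC values are 7-bit: keep them within 0..127."""
    return max(low, min(high, value))

def cc_message(channel, controller, value):
    """Encode a Control Change message as three raw MIDI bytes."""
    status = 0xB0 | (channel & 0x0F)  # 0xB0 = Control Change status
    return bytes([status, controller & 0x7F, clamp(value)])

class EncoderParam:
    """Tracks one encoder-driven parameter, e.g. track volume."""
    def __init__(self, controller, value=64, step=2):
        self.controller = controller
        self.value = value   # start mid-range
        self.step = step     # how far one detent moves the value

    def turn(self, ticks):
        """ticks > 0 = clockwise, ticks < 0 = anticlockwise."""
        self.value = clamp(self.value + ticks * self.step)
        return cc_message(0, self.controller, self.value)

volume = EncoderParam(VOLUME_CC)
msg = volume.turn(+3)       # three clockwise detents: 64 -> 70
print(msg.hex(" "))         # prints "b0 07 46"
```

In a real controller these bytes would be written to a MIDI port rather than printed; the clamping matters because an encoder can spin past the 0–127 range a CC value allows.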
Both these ‘knobs’ have been designed using the 3D printers kindly made available to us by the DM Lab group’s current home, Eagle Labs Salford.
Eagle Labs have been an important part of this story. They are an organisation, sponsored by Barclays Bank, with a commitment to ‘fostering innovation and facilitating inclusive, shared growth for all across our communities’.
James Medd, who is on the Nashesizer team, runs the community space at Eagle Labs used by DM Lab, and has been a very positive influence on the project so far.
In addition to the joystick and rotary encoder, we are also aiming to give the Nashesizer some form of gestural controller, i.e. a sensor that responds to hovering hand gestures rather than touch. We have also talked about a touch screen, although this would have to be large to work for me.
The grant from Sound and Music has allowed us to purchase much needed materials to test, and the process is very much one of trial and error. Each iteration will take time for me to experiment with and report back on.
Stay tuned for more developments soon!