While at Ensoniq, I worked out an algorithm for inverse-fitting a 13-band graphic equalizer to the hearing loss of a near-deaf patient. The equalizer hardware is diagrammed below; it is worn inside the ear canal. The original algorithm by Paul McLean worked fine for most people, but when hearing loss was severe, it tended to peg the gains at maximum. To solve that problem, I had to redesign the fitting algorithm.
I formulated the problem as a projection onto a thirteen-dimensional region in Euclidean space. The maximum gains bounded that region, making it a polyhedron. The algorithm I created found the closest edge (in the 1-norm sense) and traced along it to the point of minimum distance from the desired response, which of course lay outside the polyhedron. Essentially, I reinvented what is known as Linear Programming about 25 years after it was published by Dantzig. Ironically, I had a copy of Dantzig's book in my possession at the time, purchased because I thought it was about programming computers; otherwise I had no idea what it was about. It wasn't until I studied the subject with Stephen Boyd at Stanford that I learned about Dantzig's work and finally understood the heritage.
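In modern terms, the bounded-gain fit is a constrained least-distance problem: find the point inside the feasible polyhedron closest to the desired response. A minimal sketch of that idea, assuming the simplest polyhedron of all, independent per-band gain limits (a box), where the Euclidean projection reduces to clipping each band; the function name, units, and limits here are illustrative, not the original code, and coupled constraints would need the edge-tracing (or a modern LP/QP solver) described above:

```python
import numpy as np

def fit_gains(desired_db, max_gain_db, min_gain_db=0.0):
    """Project a desired 13-band response onto the feasible set of gains.

    Sketch only: with a box of per-band limits, the closest feasible
    point (in the Euclidean sense) is obtained by clipping each band
    independently. Bands within limits pass through unchanged.
    """
    desired = np.asarray(desired_db, dtype=float)
    return np.clip(desired, min_gain_db, max_gain_db)

# A severe-loss prescription that exceeds a hypothetical 45 dB hardware
# limit in several mid bands:
desired = [10, 25, 40, 55, 60, 62, 58, 50, 45, 38, 30, 20, 12]
fitted = fit_gains(desired, max_gain_db=45.0)
```

With this formulation, the pegged-gain failure mode becomes visible: the bands that hit the limit are exactly the active constraints of the projection, while the remaining bands keep their prescribed values.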
Here is a very early paper I began on the subject; it remains unfinished:
It was written on Steve Jobs' NeXT computer (which I love), but the word processor I used is obsolete, so the paper remains largely unedited, with arcane terminology that I invented, typos, and assorted grammatical errors. Nonetheless, the computer programs attached to it are highly developed, debugged, and essentially complete; they stand as testament to the existence and success of this device.