Tuesday, September 5, 2017

WiX toolset finance

To bitch or not to bitch, that is the question...

Hmmm where to start?

Half a decade ago, Microsoft essentially abandoned setup tools... for their own operating system.  There was a bit of a zombie effort in Visual Studio, with the old installer tools being included, then dropped, then kind of returned... but it was walking dead.  The politics were against it and they really seemed to want people to move on and only create software for their new app store thing.  So why not kill off another of the essential tools that real people depended on to support their customers?

Anyway,  see the history of my rants on this blog if you are into badly written screeds of bile about how that has poisoned my projects.

Fast-forward... a bit.  The anointed successor has been the WiX toolset.  Essentially, Microsoft have dumped the responsibility for one of their core platform tools onto the open source community.  Be that as it may... in half a decade, the WiX folk have produced a toolset of command line tools that work.  It handles the insane complexity of the task and integrates with the insane number of toolsets for building Windows apps... and the insane number of platform changes that Microsoft keep vomiting forth... all gratis.

Now, yet again, I am trying to work with WiX, and yet again I have run into the same problem that existed every other time I have had the misfortune to tangle with it... documentation.

It's monumental.  (As in, a monument to something that you really wish had happened to someone else.)  Everything you need to know about the WiX toolset is in there... somewhere.  But there is no coherence to the knowledge it contains.  You have to assemble that yourself.

Consider this: for the best part of a decade, the WiX folk have been hammering away at this problem, and their best effort is documentation of the toolset and schema, one pretty sketchy tutorial, Stack Overflow discussions and three books.

Books are an increasingly pointless solution to the problem of knowledge about technical systems.  They are increasingly difficult to publish, useless to update, the quality of publications is falling, and the trust that they contain the knowledge you are seeking is falling with it.  Technical publishers have done this to themselves, but it's an emergent phenomenon of the speed of change vs the latency of publishing.  As such, books seem less and less useful.

I think we have passed the point where a static book is the best technology we have for communication of knowledge.  I think this inflection point occurred about 7-10 years ago.  Probably about the time I stopped buying technical books seriously.  I certainly purchased a lot after that... but it was a tapering off period that I didn't yet understand.  Now I look at the prospect of buying and using a book with distaste.  I'm seriously considering disposing of my collection of technical books because of how little use they are to me.  Everything is dated, many are out of date and none are searchable or accessible in the way I currently seek knowledge.   

Am I sad about this?  Perhaps.  Certainly a bit nostalgic, but the replacements for books are just so much better.  Online, dynamic knowledge bases are simply better for most of the day to day question and answer stuff.  The one thing they usually fail at is what I call curated knowledge.

This is very much the core business of academics: the collection, organisation, dissemination and curation of knowledge about a specific topic.  Where this is done well using accessible digital tools, it's brilliant.  The ability to access, update, search, cross reference, annotate and version knowledge in a good system like a wiki or a well structured knowledge base is brilliant.  But it's still a massive job.

I have helped build a number of knowledge bases.  None of them has been simple and none have been financially rewarding.  They are run on passion.  Wikipedia has my respect but I also see its flaws. Good writers can't eat passion. 

I think the problem winds back to the lack of viable mechanisms to identify and reward high quality documentation as part of open source (or any source) tooling. 

I see that FireGiant has started trying to sell a companion product to the WiX toolset. Will this help with the documentation? It kind of burns when you're used to the "free" access model... but the reality is that free is a pretty expensive service to provide.

I know this has been said any number of times before... but there has to be a better way.

Having stated the problem... again.  What are the possible solutions?

Patreon? Kickstarter? Micropayments? Freemium? Ad-supported? Slave writers?  All of the above?

Everyone seems to be re-inventing the same wheel for similar problems. 




Wednesday, August 23, 2017

Automation in Software development tools

Just reading and kicking around ideas about the impact of AI on software development.

The ideas I am synthesising are:

An article on creating ocular prostheses I saw on TV this morning,

This article on Brute force Proofs for Math problems:
https://motherboard.vice.com/en_us/article/padnvm/200-terabyte-proof-demonstrates-the-potential-of-brute-force-math

And an article on programmers as craftsmen that I read somewhere back in the past:
Might have been this one: http://manifesto.softwarecraftsmanship.org/


I think that there is a cusp point at which an industry transitions from being powered by human skill to automation.  This could be pure mechanical automation/reproduction or, in future, AI driven systems.  This has happened to any and all manufacturing over the past century.  Start with armoury practice or the "American Method" in weapons manufacturing.  This was a simple move from a craftsman driven industry to an industry built around a component based design, where each component was produced by a specialist.  Then the specialist was replaced by a cheaper duplication process.  This could involve humans, but it was reduced to a series of steps that could be done on machines.  The craftsman, who once had to know all the different aspects of the production process for a weapon, faded into irrelevance.  (Until the process was re-discovered and turned into a YouTube series.)

Why did this happen?  Economics?  Yes, the production line is more "efficient" at producing a volume of similar items.  There are also emergent phenomena that this production model created that were not possible in the craftsman model.

The invention of the production line model allowed many people to be part of production who were not previously able to be "one stop shop" craftsmen.  People could be good at woodwork but not metalwork.  They could be specialists at creating screws or making barrels.  They did not have to be masters of making barrels, triggers, stocks etc.  By decomposing the whole item into components, it allows more people to specialise.  Some of these specialties were boring and repetitive and would not be "enough" to keep a "total" craftsman interested, but it opened up an opportunity for people with varied ability and shorter educations, who could then be part of a production line for high quality items.  It also allows deeper specialisation to occur.  A person who spends all day producing one type of thing has the opportunity to get very good at it.  If they produce many similar things, such as screws or triggers, they can understand them at a level that is not available to a craftsman who is trying to be good at all the areas of the design and the skills and tools required to produce it.  I call this the "specialisation limit".

The second emergent phenomenon is the scale of the item design.  There are only so many skills, techniques and materials that any one craftsman could learn and invest in.  This is what I am calling the "scale limit".  A production line, though, can bring the skills of many specialists to a much larger design.  Imagine trying to build a jet engine using the craftsman method: one person learns the skills required to operate a huge machine shop and produces all the parts and components to construct one massive engine.  It's possible, but it would take a singularly unique person to pursue the education, experience and focus required to craft this kind of thing with the precision required.

There are lots of other phenomena, such as replacement of people on the production line.  Even though we do not like the thought of being replaced, it's easier to replace and train a person on a single component of production than it is to replace a craftsman who has mastered a whole bunch of different skills and processes.  That mastery is often the work of a lifetime.

One more phenomenon that I want to make a point of: the product's complexity growing beyond one person's capacity to either design or make it.  If you look at the growth of complexity from a craftsman made flintlock rifle through to something like the M1 Garand, I think the craftsman who made the flintlock could have looked at the Garand and "seen" how it worked and, given some time and modern tools, probably made a reasonable copy... but there is a point where the evolved complexity of a family of products exceeds most craftspeople.  Weapons are probably not a good subject to make this point with, as their complexity has not started to grow exponentially.  The most complex weapons I can think of off the top of my head could probably still be taken apart and rebuilt by a very competent modern fitter/machinist/electrical engineer... basically because they still do all the repair and service on these systems.  Perhaps I should call this the "serviceability limit" (since I'm making shit up...).  The point being, that without the serviceability limit, the complexity of a system can easily grow to exceed any one person's or team's capacity.  Look at operating systems if you want an example.  They have evolved way beyond the capacity of anyone to service them.

With modern tooling and processes, there are fewer and fewer roles for humans on the production line.  Only in factories doing small runs, or too poor to afford robots, do we still find people filling lots of the roles in production.  We currently have the robotic technology to replace just about everyone in a production line; however, there are still a range of "hard" bits that no one has bothered to automate.  If you look at a current generation vehicle assembly plant (watch any of the "mega-factory" type documentaries), any of them could be fully automated.  However, it's currently cheaper to use humans for the "hard" bits than it is to finally automate the whole thing.  The main role for humans is still the "creative" bit, where the product is designed and problems (created by the human users, service agents and consumers of the product) are solved.  The funny thing is that most of these creative solutions are pretty common and could be automated in part or in full already.  Think about a car assembly plant.  The design of a car is not that tricky.  They are all essentially variations on a theme.  The main variance is the "problem" that they are solving: "I need a small car to run around town and take the kids to school."  Not that hard to solve, apparently.  The whole design could be automated with enough effort, and we could produce thousands of variants of the "small car" design.  (With 3D printing of components this has got to be even closer.  I'm waiting for the kiosk where I can order a custom car and have it delivered from some mega-factory that is essentially just a giant car printer and assembler robot.)

The last phenomenon is what I call the "forgetting of skills".  Once a skill set no longer has a commercial purpose, it becomes a "hobby" at best, or is simply forgotten by the majority of people.  When I went to school, woodwork and metalwork were still part of the curriculum, even though there were virtually no manufacturing jobs in the town that would have employed me with them.  There is still carpentry in building houses, and lots of small fabrication shops doing repairs for all sorts of equipment... but the reality is that these are more foundational skills or hobby skills.  I really enjoy wood and metal work, but I accept that they are less and less involved in the economy around me.  I have to face the fact that I should not encourage my children to study these subjects at school, as they are probably irrelevant to their ability to make a living.  My point is, once a skill set, such as crafting a flintlock rifle, is no longer economically valuable, everyone moves on, and the unique set of skills and knowledge that was previously encoded in a single group of craftspeople dies out.


Anyway, to bring this back to my point....

Software development easily falls into the craftsman model at the moment.  Even on the big teams where they have compartmentalised design, programming, testing, deployment etc... it's only the first step away from "one programmer shops" producing application packages.  (Yes, I'm a one programmer shop and I'm in the dark ages... and often feel far from being a craftsman... different rant.)

My point is that there are components (clearly I use SDKs and UI controls and frameworks etc) that mean I am not re-inventing the wheel, but each of them has been hand crafted by one or more people.  There is very little that could be described as automated in software development.  I have used a couple of code generators (shout out to the XSD XML parser generator) that are great at doing one thing well.  These are the future.  Reminds me of an automated UI generator that turned up in a news feed a month or two ago (can't remember the reference at the moment), but that is the way of the future.  I think we are the last generation of craftspeople programmers.

I think there will be a point not too far in the future where round trip design/programming/compiling/services systems take over software production.  Humans will cease to be up to their elbows in every line of code, and we will at least be able to describe the interface for a module and any side effects, and not only have it written automagically, but have it proven to be correct.  This may include a suite of unit tests and any other tests, but these are human artifacts to test for human errors.  I think the tests will move to a higher level of abstraction, to prove that the design layer is robust, as that is where the creativity will remain for a while.  Once the AI is able to replace the creative limit of the humans, the whole process will be automated and software will be a dead art from the olden days.

So, let's recap:

The "Specialization limit", the "serviceability limit" and the "scale limit".  These are all human limits imposed on software systems.

I would argue that the "serviceability limit" is very loose in software, as a great deal of it is both black-boxed and so poorly debugged that it's hard to argue that any of it is serviceable.  (Even the stuff that I try to write.)  Once it's compiled and running in the wild, on operating systems that have been updated beyond what it was tested on... all bets are off.  There is virtually nothing the user can do to "service" the item.  It either works or it doesn't.

My ability to service it is still viable, but getting harder with the proliferation of platforms.  I am working on an old C++ application at the moment.  It has fairly clearly defined platform targets that evolve relatively slowly (I say relatively... whole other rant) in comparison with a JavaScript app that I was working on previously, which is just a madhouse of platform variations and patches and just...

At a certain point, it exceeds any one person's ability to service this kind of product.  I have a fighting chance as the original designer and builder, but for many legacy systems, the complexity rapidly exceeds the skills of anyone who inherits the system.  Keeping up to date with platform movements while maintaining a heritage skill set and knowledge base to understand legacy codebases gets pretty ugly.  Just in my own work, I am now looking backward over more than a decade of development using a slew of languages, frameworks and SDKs, along with a whole pile of bad design choices, and trying to decide if it's worth trying to modernise the program.  (Not that it had a large user base, but it represents a big chunk of my life that I am not sure I want to say goodbye to yet...)

The "specialisation limit" is also starting to kick in.  I actively work in a range of languages across a few platforms and there is a point beyond which I cannot work in any more.  Trying to pick up something that I used to be good at and get back into it, takes some work.  I notice the cost.  Maybe it would have been easier when I was young and my head was empty, but its filled with all sorts of knowledge that is no longer relevant for all sorts of systems that no one cares about anymore.  (I think I need to burn my bookshelf.... )
Watching the library throw out books that I have read is a pretty shocking thing.... especially when you look inside the cover and see that it was checked out exactly 5 times in its lifespan.  But the point is, these skill sets are dead.  No one cares anymore.  They are not economically valuable and my children will never learn them.

The "scale limit" is an interesting issue with software, in that without automated tools, there is a point beyond which we cannot go with a codebase.  Its just too complex to comprehend in its entirety.  (This is without taking into account the hidden complexity of all the SDK's and the platform code that its running on...) Some of the round trip type tools allow you to deal with larger amounts of complexity, but at a certain point, it will exceed your limits, simply because we all have limits.  Automated systems are theoretically limited but the reality is that they can run with much larger capacity than any single human... and at that point we are irrelevant. We will no longer be ecconomically viable.

Programming will be a hobby at best.  Our children will order software from automated kiosks that present customised packages assembled by software robots.  The company with the best robots will win.

It's interesting to look at the evolution of machine tools alongside the evolution of the production line: mechanical tools, electric tools, CNC tools, laser tools, 3D printers, design software, welding robots etc.

Looking at a woodworking shop (because I can watch YouTube...), you can contrast hand tools (planes, chisels, saws etc.) with power tools (power plane, thicknesser, jointer, mortiser, biscuit joiner, table saw etc.).  The advantage of the power tools is speed and reproducibility.  The material is the same and the construction is the same, just faster.
Once you move up to CNC driven cutters and routers or 3D printers, you are no longer working the same material or construction methods.  Essentially, the materials have had to evolve to suit the tools, and the products that are produced are no longer the same.  Take a wooden box made with traditional techniques vs a wooden box made of MDF, cut out on a CNC router, assembled by a robot and finished in a spray booth.  The first has been crafted from pieces of natural wood, with grain and form, finished in a pleasing way by the maker.  If the maker used hand tools it may be a little uneven, or not; the difference between hand tools used with care and power tools is not that great.  The difference between that product and the one produced by the cutting robot, however, is much more distinct.

Both are functionally similar, in that they are a box, but the second is really defined by the process and materials.  This is because the production line for robotic wooden products is still only in about its second or third generation.  The next generation will be robotics that works with natural materials and reproduces a wooden box that appears to be made by a craftsman.  Initially there will be all sorts of cheats to make it easier for the robot to handle the components and to cut the joints... but these are not hard to overcome.  It's quite easy to visualise a production line of robots manufacturing wooden boxes that are not easy to differentiate from one made by a human craftsman.

All these tools are working in the production space.  None have yet moved into the "solving a problem" meta space.  A robot has not yet been developed that can identify the need for a wooden box, build it (or commission it) and supply it/install it and then finally service it through its functional life.  This would be a vertically integrated system. 

If we look at software developers' tools... it's hard to see that we are really that far along.

Text editors are pretty nice, but once you start to list all the "power tools" that can be used, the list is quite short and only implemented in a few languages.  (There's a sketch of the first of these after the list.)

  • Indenting / formatting tools - pretty print the file, de-minify
  • Spell checkers - syntax checkers and variable/function name checkers
  • Cross referencing tools - quickly jump between files, declarations, definitions etc
  • Re-factoring tools - lots
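
Just to make the "power tool" idea concrete, here is a toy sketch of the pretty print / de-minify idea in Python (assuming Python 3.9+ for ast.unparse; the mangled one-liner is made up):

    # Toy "pretty print / de-minify" tool: parse mangled-but-valid Python,
    # then regenerate canonical source from the syntax tree.
    import ast

    def pretty_print(source: str) -> str:
        tree = ast.parse(source)      # syntax check and structure recovery in one step
        return ast.unparse(tree)      # re-emit with standard formatting

    minified = "def f(x):\n a=x*2;b=a+1;return b"
    print(pretty_print(minified))     # prints the function re-expanded, one statement per line

Nothing clever going on there: it is still just automating what a human with patience could do by hand.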

In the rest of the toolchain we have testing tools, compilers, linkers, packaging macros, build tools, installer builders etc.  Let's not belabor the point...

Every one of these is still just duplicating what you could do by hand.  (Yeah, I realise that compiling a program by hand is beyond me...) but that's the point: these tools are simply doing what is humanly possible... faster.  Essentially, "power tools" in the above woodwork analogy.  None of them is doing anything not humanly possible... the materials are still the same.  There is little integration between the systems (except the compiler and linker... again, not the point).

I think the next step will be when the materials are modified to be more machine friendly than human friendly.  Looking at all the "intermediate code" and Java bytecode that is around, you can see the process is clearly under way.  While we could kind of work with bytecode, it's not economically viable.  There is little point in trying, and so the skills are already being forgotten (or not taught to the next generation).  I find it hard to imagine that I would encourage my kids to learn assembler.  It's just not a good use of their time.
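
To see how inhuman the "materials" already are, here's a quick look at Python's own intermediate code using the standard dis module (the exact opcodes in the output vary between Python versions):

    # Peek at the machine-friendly material Python actually executes.
    import dis

    def add_tax(price):
        return price * 1.1

    dis.dis(add_tax)   # dumps bytecode: LOAD_FAST, a multiply op, RETURN_VALUE...

Nobody hand-writes that, and nobody mourns it.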

When the software design tools do not bother to produce human readable code, even as an intermediate step, programmers are irrelevant.  When the design tools take a rough problem description, then software architects are irrelevant.  When bugs are automatically removed, then software serviceability will be solved.

I think that this will happen when we have a high enough level language to describe a problem and its boundaries and a compiler/software robot can translate that into a solution and customize it to our desire.

With the current crop of AI, I think the question is whether this will be a human readable language or simply a pattern recognition neural net that can interpret our vague problem definitions and produce a software solution on demand.  At which point, there will be little point in evolving operating systems by human hands anymore.  Machines do not need any of the graphical user interface crap that has bloated most of our OSes.

Anyway, the vertical integration process will continue.

It's interesting that consumer goods like the above "wooden box" example are still so far from being vertically integrated and human free.

There is no robot or chain of robots that can harvest a wild grown tree, mill it, dry it, cut, fit and assemble it from real wood into a box and deliver it to me.  Note, wild grown tree and real wood.

There are sections of that production chain that exist and have been integrated, but they are still little autonomous units that are struggling to connect. 

There are a few more production lines that are able to consume wood chips, turn them into MDF/plywood etc., then cut, assemble, package and ship it... but still not even close to vertically integrated if you look at the "needs" end of the chain.

For me, vertically integrated would be to start with a design (or brain fart) from me in natural language, including the type of wood, figure, finish etc (or even have that automagically recommended, which is not hard based on my preferences), plus all the customisation with my name, personal carving styles, size, inserts etc; then go out, source the materials (cut on demand or stockpiled by robots), manufacture the box, pack it, ship it to me and present it in a useful time frame.

Bringing all that together will be the work of the next few generations.  I can see it happening with engineered materials in the production section of the chain.  But the problem definition is still rudimentary with our current natural language processing, and the automated design should be pretty straightforward, but isn't yet.  The packaging, shipping and delivery is still at the whim of retailers who are struggling to produce a pair of customised pants.  Once you introduce the additional problems of non-engineered materials, it's not hugely more complex, but the robotics need to be a lot better.

It would be awesome to see a tree farm managed by robots.  Maybe not "managed" but certainly operated.... I predict in the next 20 years. 

Having finished this brain fart I then read... https://www.microsoft.com/en-us/research/blog/program-repairs-programs-achieve-78-3-percent-precision-automated-program-repair/


Tuesday, May 24, 2016

Useless Musicians


Musicians and Composers are the poets of our age. They give us the words to say when we cannot think of them ourselves....

Ok, so I may be paraphrasing a little but the sentiment makes enough sense for me to construct my argument around.


Consider this: for all the major events in your life that you want to be able to express your feelings about, how many songs can you think of that are appropriate, or even close?  Let me throw up a few situations and see if you can think of something.

Falling in love?

Getting your heart broken?

Falling in lust? 

Dancing?

Birthday? 

Being away from someone that you love?

Party time?

Ok, these were pretty easy scenarios.  Songs about these kinds of situations are pretty common.  Now try a couple of more challenging emotional situations that might choke up your flow of expression.


Getting a new(better) job?

Getting out of hospital?

Finding out you're pregnant?

Getting evicted from your family home?

Getting bullied at school?


Getting a bit harder to think of examples of relevant music and lyrics?  These are actually pretty common emotional events in life and could happen to "normal" people. Once you start looking at even more frequent events... the number of songs gets even smaller.

Getting arrested?

Paying off a debt/mortgage/loan?

Giving birth?

Burying your parents/partner/child?

Learning someone close to you is missing?

Getting diagnosed with something horrible?

Once you move on to some of the big issues in life, the number of songs gets really small. 

Racism?

Systemic poverty?

Environmental destruction?

Hunger?

Inequality?

Greed?

Mental illness?

Governmental Corruption?

Slavery?

False imprisonment?

Abduction?

Entrapment?

People smuggling?

Whistleblowing?

The point I want to make is that I'm finding it hard to find much in the way of music and lyrics that provides any expression beyond the teen mating rituals.  It would be nice to see musicians tackle a few of the bigger issues in life.



Monday, April 27, 2015

Pointless philosophical debate - Disengagement, Debugging and Futility


I listened to an Honours presentation a few years ago called:  "Knowing When To Quit: Self-Focused Attention Enables Disengagement, Positive Affect Predicts Reengagement Tendencies. (Tristan Hazelwood)".

It was on one of those aspects of psychology that I had never considered before: that of "task disengagement"... or the ability to quit doing something.  I found this quite personally intriguing because I had always seen "quitting" as a failure state, and even though I have quit a lot of things in my life... it was always with this sense of personal failure.

Tristan's experiment looked at how long people will continue to attend to a "futile" task before stopping.  There were some manipulations as well... which are not relevant here.

I had never really asked myself this question or the extension questions.

How long do I keep doing futile tasks? 
How do I know a task is futile?
What if a task seems futile but is ultimately not?
What if the effort required to complete the task is disproportionate in comparison to the reward?

Anyway, this gets back to today's issue: fixing bugs in other people's code.

I have just spent way too long trying to debug why the wrong favicon was being displayed in a particular browser (Firefox).  This required me to:
  • Verify my assumptions about the local copy of the site files (Still broken)
  • Verify my assumptions about the server copy of the site files (Still broken)
  • Reload everything in Firefox
  • Test in multiple other browsers
  • Read up on how Firefox handles FavIcons
  • Break into the SQLite db used by Firefox to cache the favicons (see the sketch after this list)
  • Clear the cache, reload, test, clear again, clear a different way... test-fiddle cycle until I got the combination correct and finally expunged the old favicon from the chain of server-browser-cache rubbish.
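
For the record, the "break into the db" step is about two lines of Python with the standard sqlite3 module.  The profile path below is a placeholder, and the table name (moz_favicons in places.sqlite) matched the Firefox of the day; later versions moved icons out to a separate favicons.sqlite, so treat the schema as an assumption:

    # Sketch: list what Firefox has cached as favicons (close Firefox first,
    # or the database will be locked).  Path and schema are assumptions.
    import sqlite3

    db = sqlite3.connect("/path/to/firefox-profile/places.sqlite")
    for url, mime in db.execute("SELECT url, mime_type FROM moz_favicons"):
        print(mime, url)   # spot the stale placeholder icon still in the cache
    db.close()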

In hindsight, knowing what I know now, this was a worthless task.  It was a combination of a stupid caching policy/bug in Firefox and my uploading "placeholder" images before I had the final design for the favicons.  The problem would not even have been visible to the client, because it lived in my local browser cache.  But at the time, all I could see was an error!

When is a debugging task "worthwhile"?

I have a long history of tenacity with debugging... and in hindsight many of the more intractable bugs now seem to be examples of time wasted and needless distractions.  Others were important and meaningful battles that I needed to fight to get projects over the line. All were bugs.... are there any clues to differentiate between valuable battles and needless distractions?

When the bug is in someone else's code...


I think the first possible clue is where the bug lies.  Debugging other people's products is often a losing proposition, even if you do pin the bug down.  I have spent way too long proving, documenting and reporting bugs in products that I use, in the hope that they would be fixed.  Some have been... but I think the majority of them resulted in a discussion with the product programmer/support minion that not only wasted more time, but ended with them not resolving the bug in any way that got me a fix when I needed it (i.e. very soon after identifying it).
I would suggest that while these activities were probably noble, they were essentially of no value to me or the work I was trying to get done.  (Except where they resulted in me clearly establishing that the bug was not in my code.)

My point?  Stop debugging other people's code where possible.  There is just not enough time in my life to fix all the broken software (and get a usable result).  Once I see that the bug is not in my code... give up.  Find a work-around, hack it, abandon the tool/framework/etc... but move on.

When there is more than one way to achieve a result...


The difference between "the right way to code" and the "expeditious way to code" can sometimes be clear... however, I find there are usually lots of ways to get to the same result.  Unfortunately, I often find that the "recommended" way in the docs/tutorials/book/forum post/back-of-the-toilet-door is either out of date, incomplete, partially thought out or just plain wrong.  The time I have spent trying to get stuff working that I found on the internet... that eventually turned out to be rubbish, is painful to recount.  (This is mostly my own fault for trying to work in too many languages/toolkits/frameworks etc without taking the time to be an expert first.)
I think that I need to be willing to abandon broken code a bit faster than I currently am.  This often comes back to time pressure: I'm trying to get something working quickly rather than taking a bit more time to learn the docs first.  False economy.

That being said, I would say that more than half the time when I have gone to the docs for whatever I'm working on, the docs are wrong in part or in whole.  So I'm not sure there is any actual win in reading the docs first... but it might help in some cases.  To paraphrase... go back to the source, Neo...

My advice?  Don't hold onto any ideas of the right/clean/ideologically sound/best practice way of doing something.  Get it working as best you can and clean it up later if it proves to be a problem.  Future proofing is a how-long-is-a-piece-of-string argument... you cannot win, so don't fight the battle.

Usually I find that the next time I try to solve the same problem, I inevitably take a different approach anyway... so I tend to end up rewriting rather than fixing... but that's me.  The key point is to let go of something that is not working, no matter how ideologically "right" it may be.

The curiosity trap...


Debugging to figure out how something works is a really painful way to learn.  There are probably lots of scenarios where it's the only way (security work, reverse engineering malware, deep bugs in critical systems etc), but for most of the desktop stuff I do, there is just no reason to be learning that far down the levels of abstraction.  It's just too slow.

My point?  I think when it's curiosity driving the activity and you find yourself trawling through code... "you're doing it wrong".  Go feed your brain on Wikipedia, or get a pet, or play Sudoku... the urge to solve puzzles can lead programmers into some really pointless time wasting.

Neat and tidy.... 


The urge to "finish" something or make it "tidy" is a seductive beasty!  It can lead you to make beautiful things... or take you to crazy places.

I find that neatness in coding is important so you can stay on top of the project... but there is a slippery slope beyond that point where neatness for the sake of it becomes a procrastination exercise (he types on a blog... laughing ironically as the keystrokes land...).

Basically, there is no limit to how far you can take neatness in coding.... I think the best advice is to go the other way... encourage messiness to the point you cannot function... then take a tiny step back. Minimize how much extra time you need to spend on the housekeeping...

This ties in with our human pattern recognition systems, so it can be a bit of a two edged sword (triple edged swords just mess up this metaphor... so just don't...), in that it can be very valuable but can also lead to the dark side.

Neatness in the code allows us to scan repeating structures in ways that don't involve fully conscious "thought".  This can be a really valuable behaviour; it has personally spotted issues for me more times than I could guess.  However... the other side of this is that it only works with repetition.  To get maximum value, you need repetitive patterns.  Once these patterns get larger... you start to run into the bad 'code smell' that DRY (Don't Repeat Yourself) warns about.  So I think it's probably best used for the "small stuff": how I order operators, how I use whitespace in a statement, the way brackets are laid out (this may be why code formatting is such a universal issue for code jockeys)... but once I start to see repeating blocks of statements... it's time to refactor, or shop for a red lightsaber.
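
To make "repeating blocks" concrete, here's a toy Python example (the item lists and the price field are made up):

    from collections import namedtuple

    Item = namedtuple("Item", "price")
    items_a = [Item(1.10), Item(2.20)]
    items_b = [Item(3.305)]
    items_c = [Item(0.99), Item(0.01)]

    # Before: three near-identical statements.  Easy to eyeball-scan,
    # and just as easy to let drift apart when one of them changes.
    total_a = round(sum(x.price for x in items_a), 2)
    total_b = round(sum(x.price for x in items_b), 2)
    total_c = round(sum(x.price for x in items_c), 2)

    # After: the repeated block becomes one named idea.
    def rounded_total(items):
        return round(sum(x.price for x in items), 2)

    total_a, total_b, total_c = (rounded_total(i) for i in (items_a, items_b, items_c))

Small repetitions are fine fodder for the pattern scanner; blocks like the "before" section are the cue to reach for a function.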

I think there is a bit more to think about in this area of pattern recognition in coding, but that's another post for another day.

In summation, your honour...


Estimating is always easier in hindsight.  Knowing when to let go of a task is an NP-hard problem.  Having written all these ideas down, it's still not clear that there is any generalisable wisdom in any of it.  But I feel a bit clearer having articulated some of the issues, which is really the point of writing it down...








Friday, April 17, 2015

Automated Passenger Aircraft

In light of the Germanwings crash, I have just seen proposals for remote control of passenger aircraft.  I almost choked with laughter (at the solution... not the horrible problem).

Lol.... Called it!

See my random rant from 2013... http://stratasphere.blogspot.com.au/2013/04/plane-hacking-or-not.html

Ok, work with me here. 

1)  Plane with a single pilot who goes bad can crash/hijack/fly into towers etc..  

2) Plane has two pilots (captain and co)... one goes bad.  The other is supposed to take control (physically... 50/50 chance, I guess... unless the bad one did some forward planning)... same result in some cases.

3)  Plane has two pilots who collude and go bad together.....

4) Plane has one remote pilot.... who goes bad.  Result... whatever the pilot wants.

5) Plane has multiple remote pilots who collude and go bad together....

6) Plane has one pilot and one remote pilot.... they disagree.

7).... other permutations


These are all examples of the two clocks problem... which is essentially a trust problem.  I have had to explain this to so many students that I can do it in my sleep now.  Not sure where I first heard it, but it was a while ago.  I have repeated it and mangled it so much that it's probably not even recognisable.  However, it is very useful for thinking about instrument design and trust in black boxes.

My version of the two clocks problem

So, on old-school sailing ships they used to calculate their map position using clocks as a reference.  The captain would wind the clocks every day, and by plotting angle and speed against the clock's time, he could fairly well calculate his position on the ocean.  However, clock making technology was not always wonderful... which resulted in clocks running slow, running fast or stopping.

What do you do if you are out on the ocean and the clock fails?  You're lost!

So one solution was to have two clocks on board.  Wind them both at the same time, keep a log of any time differences, keep them in the same place so the conditions were the same etc.  (Control your variables.)

What if you wake up and the two clocks disagree?  Which one is right?  (Which do you trust?)

So the solution is to have three clocks.  If at any point the clocks disagree... the odds are that only one will fail at a time, so the other two should agree, and you can reset the bad one (and make some notes in the log about not trusting the shoddy clock).
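
The logic is trivial to sketch in code (a toy Python version; the tolerance value is an arbitrary assumption):

    # Three-clock rule: an agreeing pair outvotes the odd one out.
    def majority_reading(a, b, c, tolerance=1.0):
        """Return the value backed by at least two clocks, or None if all disagree."""
        for x, y in ((a, b), (a, c), (b, c)):
            if abs(x - y) <= tolerance:
                return (x + y) / 2   # trust the agreeing pair; reset the third
        return None                  # total disagreement: you're lost either way

    print(majority_reading(12.00, 12.01, 9.30))   # ~12.005; the 9.30 clock is the shoddy one

With only two clocks, every branch of that function is a coin toss.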

The point being that in a trust game... you cannot differentiate between two conflicting but trusted positions.  How do you know which pilot is bad? 

So is the solution to have three pilots on each plane?  Remote pilots? Robot Pilots?  Which do you trust?

The problem is not that you cannot design a robust system... the problem is that a robust system will appear to be inefficient while it's operating.  The pressure to cut costs will always be a factor in free-market economics... so any system with two redundant parts will eventually be simplified down to one.  Simply because nothing bad ever happened...

Keep in mind that trust is a dynamic between components... it's not a property of any single component.  This is the opposite of the profit principle, which says cost is a property of each component, and reducing cost is so easy...


Duncan's first law of system design

An economist will always fuck up any system they are given control over.

Why?  Because their minds do not work right.  They suffer the human frailty of trying to simplify and generalise based on perceiving repeating patterns in their view of the system.  This lets our mere mortal brains make sense of overwhelming complexity.  It gives them the idea that they can "optimise" or get efficiencies... but remember that their perceived patterns are based only on the amount of observation/data they have access to, rather than a complete mapping of all possible cases.


The secret of any trust system is not preventing it from getting into a conflict situation, but designing for the inevitable undesirable cases and having an elegant way to get out of conflict.  (Different to risk mitigation, which is an economist's way of trying to cope with edge cases.)

If the air safety groups were not economists, they would design a flight system that could be flown safely by a suicidal pilot.  But once they start thinking around that corner... they cannot be economists any more.

The economist mindset will always try to eliminate/replace/fix the "bad" component in the system and assume that everything else will remain the same.  This is such Newtonian thinking.  The universe is not a giant clock.

Bad is very very very relative....

Imagine an aircraft flight system that could be flown by a healthy happy pilot, a suicidal pilot, a hijacker or a child... all in total safety for the passengers.  Once we crack that, we are probably ready to call aircraft "safe".  Is this the same as driverless cars?  Are there still situations where the skill of a "human" is our last hope?  (Probably, given the state of hardware system design...)

The point being that the passengers should be safe even when the system is in a failure state.  Why are there no ejector seats?  Why no ability to separate the passenger compartments/pods and fly/land them with chutes or whatever?  If you were designing any other system with a massive single point of failure (pilots and cockpit), you might think about some sort of backup... but aircraft designers seem to have some sort of catastrophic inability to learn from any other industry...

Moving on....

Thursday, April 16, 2015

Thought Viruses

I have been doing some reading about "Narcissistic personality disorder" and "Borderline personality disorder" for various reasons.    While there are all sorts of aspects to these conditions, they are essentially a set of "thought patterns" which are expressed as a grab bag of symptoms of varying intensity by the victim. 

The key point being that these "thought patterns" are "trained in" by an abusive parent (usually).  They do not stem from a physical injury or anything else.  Essentially the child is fine beforehand, then afterward is broken by exposure to the parent's condition.  Similar to PTSD.

The complex issue that I keep coming back to is that these particular conditions are repeated down the generations unless interrupted during transmission.  The condition is self-replicating and self-maintaining.  I.e. a parent with a mild case of NPD can damage a child who manifests a strong case of NPD... so the condition does not "weaken" over generations.

To me, this is a perfect example of a "thought virus".  The same sort of pattern seems to happen with bullying (although I have not read as much about it), where the victim of a bully may go on to become a bully themselves, thus replicating the condition.


I'm sure there are a bunch of these kinds of "thought patterns" that are transmitted from generation to generation.  Some we call "wisdom", "habits", "myths", "family culture" etc.  But like all symbiotic relationships, it's the negative ones that get called names.


The interesting aspects of these constructs are that:

A) If they can be given to someone, they can be taken away.  (More or less cured... in theory.  This ignores any damage done while the victim was carrying it, which could be substantial.)
B) Can we develop an immunisation for this crap that will remove it from society?
C) Is there a liability for society in allowing parents to infect their children with this kind of negative thought pattern?  If so, should society identify and treat it before the infection can jump to the next generation?
D) These things can only work in a particular context.  NPD depends on isolating the victim(s) and creating and re-inforcing an alternative "reality bubble" around them.  Can this be defeated simply by not leaving infected parents alone with un-infected children?  (This is probably the cure for all sorts of shitty parenting...)
E) These kinds of thought patterns could potentially be transmitted to artificial intelligence and back again.  Something to think about...


Once you start looking at thought patterns as a transmittable "thing", you start to see all sorts of passive and active mechanisms that may be playing large or small roles in the process.  I have seen a few articles of the "playing computer games is re-wiring my brain" kind.  I know logically this is true... I have just never considered the full extent of this kind of massed, repetitive rewiring.  It's funny to see social conventions emerge and propagate on game message boards... and then to see them make the jump into social memes... but in reality this is how society has evolved: a collective set of thought patterns that are self-sustaining, transmissible and self-reinforcing.  Keep in mind that all thought patterns are emergent (random, chaotic and useless until reinforced by utility), and it's easy to think of "intelligence" as one big virus...

Once you conceive of the human brain as simply a virus laden organ which can be infected by other viruses.... it gets weird. 

But as a mechanism to explain intelligence, thought, society etc... it's pretty neat and tidy.
 

Wednesday, February 18, 2015

Why Solar power systems suck


I have just finished my yearly research into the current solar schemes and system prices, and I have come to the conclusion that it's a big scam and will continue to be for some time.

Let me be clear that I am talking about urban, grid connected systems with easy availability of grid electricity.  This is not about remote area power or fringe-of-grid power (where the quality of the grid is a bit wobbly).

1) Pricing

The pricing model for solar systems is always pitched in comparison to the grid.  The sales people are simply trying to "beat the grid" rather than pricing the systems based on any intrinsic quality of the system.  Usually they are only trying to be "just" better than the grid price.  The other problem is that the way they calculate that price is an upfront cost versus a "best guess over time" cost.  This all gets pretty wobbly once you start trying to guess what your usage will be in the future, what the grid price will be in the future, etc.

2) Power output

The models used to predict power output from a solar system are pretty simple: amount of sun per day, number of sunny days for your region, potential output from the panels, decline in panel output over time, time of day you want to use the power... but you try getting a straight answer from a salesman.  Once you do the math from first principles, it's not rocket science.  But you also realise just how little actual usable power you will have at any point in time.

Then you realise just how much power you will be returning to the grid for next to nothing, which leads to the realisation that you will be buying most of it back at six times what you were paid for it.
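
A back-of-envelope version of that math in Python (every number below is an illustrative assumption, not a quote from anyone's brochure):

    # Rough solar payback estimate.  All figures are made-up assumptions.
    system_cost    = 6000.0     # $ installed
    daily_output   = 5 * 3.5    # 5 kW of panels x ~3.5 effective sun hours = 17.5 kWh/day
    self_used      = 0.30       # fraction used while the sun is up (you're at work)
    grid_price     = 0.30       # $/kWh to buy from the grid
    feed_in_tariff = 0.05       # $/kWh paid for exports (about 1/6 of the buy price)

    daily_saving = (daily_output * self_used * grid_price
                    + daily_output * (1 - self_used) * feed_in_tariff)
    payback_years = system_cost / (daily_saving * 365)
    print(f"~${daily_saving:.2f}/day saved, payback in ~{payback_years:.1f} years")

Shuffle those guesses around and the payback lands anywhere from "reasonable" to "never", which is exactly the problem.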

3) Storage

Storage is the only thing that makes a solar system make sense. But once you look at the additional cost and risk it just gets stupid quickly.

The storage systems that are available are expensive, high risk and high maintenance.  You require yet another box on the wall to manage the storage, which increases system complexity and risk.  Trying to do anything like a gravity battery or hydrogen system is a joke.  There is just nothing viable unless you go for old school battery banks with the associated problems. 

4) Risk

Take 15 different pieces of equipment, wire, frames and a connection to the grid, and figure out the failure rate of them all, the cost to replace, the downtime, the potential side effects of collateral damage in the event of catastrophic failure, the additional risk of high voltages floating around the house, and the general issues of dealing with more trades, small businesses and 15 to 25 year warranties from companies that change their names every two years or only supply under short term contracts...  It's just a massive pile of risks that are difficult to mitigate.  Some are physical risks, some are economic, some are reliability risks... and they are all your problem once you buy the system.

About the only way that you can mitigate some of these risks is to buy insurance (or add the system to your existing policy).  However, as we are again playing guessing games with the future, I think it's hard to know if the insurance will actually cover all these issues.  Even if it does now, it's possible that this could change at some point during the period you own the system... yet another risk.

5) Investment return

Honestly, the investment returns I have seen are rubbish.  Depending on how you massage the spreadsheet and how much wishful thinking you inject, you can get a flat payoff period somewhere between 6 and 10 years.  IF you play the system very hard.

If you live in the real world and work outside the home during the day... then you are pretty much screwed.  Unless you can either store or use the power during the day, your ability to recover your initial costs is seriously diminished.  The power going into the grid during the day will not be nearly enough to offset what you use during the evening.

This is yet another risk that is not disclosed in any of the literature that is easily available: your system dictates your lifestyle.

The return from the government rebate or a grid feed-in rebate is always going to be a game run by the big players.  YOU CANNOT WIN.  There is no interest in making it a fair game.  Even if it were, it would still be a game of SCALE.  Big generators will be able to get efficiencies that small players cannot.  The overheads will always push small producers out of volume markets.

6) Reliable data

Just trying to get hard enough data to differentiate between two products in the solar market is way too hard.  It's just bullshit.  You try differentiating between two inverters based on anything other than the published price and the colour of the box... you have virtually nothing to go on.  I have not found anything like an independent testing body, or any useful data to base a performance decision on.  Even then, the sales people will finally admit that once the system is installed they will need to "tune" it to get it to perform adequately.  This may continue for a year or more.  (So for the first year or so of your system's lifetime, it may not work properly.  Should this be part of the product disclosure?  What effect will this have on your payback period?  Does this add more costs?  What is the risk that they will never get it working "adequately"?)

7) Inflexibility

Once you commit to purchasing the system and all the associated unknowns... you are stuck with it for the future.  If you have had a look at the resale value of an installed system... unless you sell the house with it installed, it's going to be a lot smaller.  Keep in mind that you need a sparky to extract the system and a way to deliver it to the new owner.  Then you have the loss and liability involved in selling something (check consumer law).  So, in summary, there is no cheap way to change your mind without taking a bath.

Now think about the technology.  The panels degrade over time, and the inverter is a computer.  The batteries do not get better with use.  Everything gets much less valuable the longer you own it.  At some point everything you have purchased will reach a zero value point and you will need to dispose of it.  The batteries are actually the only thing that will have much value at that point, as the lead will still get a good price from the scrap metal dealer.  Old panels might be able to be sold as-is, but it will take a sparky to evaluate them, so no one will be buying them on sight.  This will add an overhead to disposal, unless you simply sell them for scrap value (about 1 kg of aluminium and some steel), or about $10 for a panel that cost you $500 new.

So, if you look at the system as an investment, it's a very rigid deal.  You either stick with it for 15 years or you lose.  And your chances of getting any sort of pay-off are minimal.  Your best hope will be to get your money back, but after 15 years of indexing at between 2 and 4%... it's going to be eaten up either way.

8) Ideological Bullshit

The amount of lying involved in the solar debate is just gobsmacking.  Both lies of omission and lies of commission.  Once you get past the crap, the only substance left is the ideology: the idea that in some existential way, solar systems are somehow better.  They produce less pollution (check the production systems and the factories in China, the labour conditions, the social systems that support all this, and finally the exploitation required to keep the labour cheap), then make the ethical arguments...

From a technical standpoint I like the idea of collecting solar energy, but the technology is just not here yet to store it effectively (hydrogen) and recover it rapidly (fuel cells) without having a massively parallel system that is outrageously expensive and a maintenance headache.  The inefficiencies in the available technology just make it impractical.

Hopefully this situation will change over time, but for now, the grid is cheap, efficient, low risk, ethically neutral and very low maintenance.   Economy of scale is very hard to beat.