Wednesday, June 27, 2012

Microsoft Enterprise Library

http://msdn.microsoft.com/en-us/library/cc467894.aspx

This is not the first, and I doubt it will be the last, "application building blocks" resource that I will run across. However, the interesting thing is where it fits into my headspace as a developer.

The No-reuse model

The application implements all the functionality and essentially re-invents any wheels it needs.

The Framework model

The application uses a domain-specific framework that has a pre-built set of infrastructure objects/components which all interact in a way that works, but usually requires the application to accept it "as-is": either use it the way it was designed/built or not use it at all. It's a package deal. The key point is that the framework is integrated. However, it may do things in a different style or contain some invariants that do not quite fit the problem at hand. This makes for an awkward fit that may require the application to "avoid" some bits of the framework and use the platform directly anyway.

In business terms this would be considered horizontal integration and represents a monopoly layer that is common to a number of businesses.

The Building Block model


In this model, the application re-uses smaller decoupled units.  These are stand-alone components that can be used, avoided or re-invented at much less cost to the application developer.

There are always tradeoffs, but this allows vertical integration and specialisation without carrying the pros and cons of a whole framework.
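
A minimal sketch of what the building-block style looks like in practice (all names and components here are hypothetical, not taken from the Enterprise Library): small, stand-alone pieces that the application composes itself, each one cheap to avoid or re-invent.

```python
# Hypothetical example: two tiny, stand-alone "building blocks" composed by the
# application. Neither knows about the other, and either could be swapped for a
# hand-rolled replacement without touching the rest of the code.

import time


def retry(func, attempts=3, delay=0.5):
    """Call func(), retrying on failure. A stand-alone unit: no framework,
    no shared base classes, nothing else to buy into."""
    for i in range(attempts):
        try:
            return func()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(delay)


class SimpleCache:
    """A trivial in-memory cache. Use it, avoid it, or re-invent it."""

    def __init__(self):
        self._store = {}

    def get_or_compute(self, key, compute):
        if key not in self._store:
            self._store[key] = compute()
        return self._store[key]


# The application wires the blocks together itself, rather than being wired
# into a framework's object graph.
cache = SimpleCache()
config = cache.get_or_compute("config", lambda: retry(lambda: {"timeout": 30}))
print(config)
```

The point is not the components themselves, but that each one can be dropped or replaced independently, which is exactly what the framework model makes expensive.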

I personally find many of the frameworks to be overly heavy solutions for my particular problems. Trying to adapt my problem and solution to the framework is often more work than solving the problem and doing some re-invention.  The cost of the framework's lifecycle, community issues and other "intangibles" on your project may be difficult to calculate when you are evaluating a framework as a potential solution. The hidden overheads add to the maintenance cost and the uncertainty, especially with longer-term projects.  The value of having a lot of the work already done tends to fall off quite rapidly when your project grows beyond one or two life cycles of the framework, especially if the framework evolves in a different direction than you wish to go.  The cost of gutting a production system and re-engineering it because a key part of the platform has changed significantly can be ugly.


I feel that the "patterns", "building blocks" and smaller, simpler re-use models are the way to go. It's almost back to the Unix philosophy (many small tools that can be assembled to do the job).  It allows the developers to "own" the whole code base, down to the platform layer.  This reduces the long-term uncertainty about the maintainability of critical components and reduces some of the moving parts in the system.  I like this.


Thursday, June 21, 2012

Categories as ordering structures

Categories. Groups. Hierarchies. Networks of categories.

These are the usual structures for information ordering.  See this article for the seed of the idea (a bit over halfway down): http://idratherbewriting.com/2012/06/11/essay-my-journey-to-and-from-wikis-why-i-adopted-wikis-why-i-veered-away-from-them-and-a-new-model-for-collaboration/

The problem with all of the above is that no matter what the structure is (hierarchy, Venn groups, directed or undirected graphs, networks and other half-assed structure ideas), the real issue is the links between the nodes.  They all represent a weight of "1", whether the link is conceptual, semantic, a relationship, etc.

It would be more interesting to make a connection between each leaf node and all the structure nodes (markup, meta-data, semantic...) with a weighted relationship.  (Neural net, anyone?)

This gives the ability to generate word clouds, semantic networks, relevance calculations etc., and more importantly, they can be pre-rendered and encoded into the structure on the fly.  They can also be updated locally or as fragments.
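
A rough sketch of what that might look like (the node names, weights and scoring here are all made up for illustration): each leaf node carries a weighted edge to every structure node, and word clouds or relevance scores simply fall out of reading the weights.

```python
# Hypothetical sketch: leaf nodes (documents) hold weighted edges to every
# structure node (category/tag/semantic marker) instead of flat "weight 1" links.

leaves = {
    "doc_wiki_essay":  {"wikis": 0.9, "collaboration": 0.7, "hierarchies": 0.2},
    "doc_neural_nets": {"wikis": 0.1, "collaboration": 0.3, "hierarchies": 0.8},
}


def word_cloud(leaf):
    """Rank the structure nodes for one leaf by weight -- effectively a word cloud."""
    return sorted(leaves[leaf].items(), key=lambda kv: kv[1], reverse=True)


def relevance(leaf_a, leaf_b):
    """Crude relevance between two leaves: sum of products of shared weights."""
    a, b = leaves[leaf_a], leaves[leaf_b]
    return sum(a[k] * b[k] for k in a.keys() & b.keys())


print(word_cloud("doc_wiki_essay"))
print(relevance("doc_wiki_essay", "doc_neural_nets"))
```

Because each result is just a read over the weights, it can be pre-rendered, regenerated on the fly, or updated locally for a single leaf without touching the rest of the structure.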

I read a couple of articles on big data crunching at Facebook and Google yesterday and the ideas are still bubbling around.

* Either seed with a relationship between each leaf node and all stem nodes and then prune, or randomly seed relationships and then aggressively create new ones.

* Build structural semantics via emergent naming systems.  This allows blind structure discovery without having to pre-name anything.

* Describe the relationships between leaf and stem nodes with something more complex than "1": simple weight systems, direction, traversal stats, utility for purpose, repeat visits (if you're Google), time of traversal, local time of traversal, reversals (back button on the browser)... all sorts of interesting data.  You could even map traversal paths and draw some conclusions about eventual destinations.  This would turn a leaf node into some kind of high-value interconnect even though it's not an endpoint in itself.  Kind of like traffic analysis looking for high-value intersections and points of failure.  If you're advertising, these seem like good places to build billboards.  Not only for a simple eyeball count, but for eyeballs that are on their way to specific things... which perhaps even they don't yet know.  (See the sketch after this list.)

* Continually evolve the relationships between the nodes (wandering dendrites?)  This could be establishing and testing paths, looking for new connections, dealing with dynamic content... various different strategies.
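
As flagged in the list above, here is a toy sketch of edges that carry more than a bare "1" (all the nodes and numbers are invented): each edge records weight, direction and traversal stats, and simply summing the traffic passing through each node starts to reveal the high-value interconnects.

```python
# Hypothetical sketch: edges between nodes carry weight, direction and traversal
# statistics rather than a bare "1". Summing traversals through each node gives
# a crude picture of the high-value interconnects (the "billboard" spots).

from collections import defaultdict

edges = [
    # (from_node, to_node, weight, traversals, reversals_via_back_button)
    ("home",    "search",   0.9, 5200, 310),
    ("search",  "product",  0.7, 4100, 900),
    ("product", "checkout", 0.4,  800,  60),
    ("search",  "reviews",  0.5, 2600, 150),
]

# Only the traversal counts are used here; a fuller system would fold in the
# other attributes (weights, reversals, times) as well.
through_traffic = defaultdict(int)
for src, dst, weight, traversals, reversals in edges:
    through_traffic[src] += traversals
    through_traffic[dst] += traversals

# Nodes ranked by the traffic passing through them, not by being endpoints.
for node, total in sorted(through_traffic.items(), key=lambda kv: kv[1], reverse=True):
    print(node, total)
```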

Hmm... ideas synthesizing...



Economics with browsers

http://www.news.com.au/technology/kogan-wages-war-on-internet-explorer-users-taxed/story-e6frfro0-1226395298505

There should be more of this.  Pushback is the only way to maintain a boundary.  If someone's choice costs you, push the cost back onto them.

The end of Tech Support

http://ifixit.org/2763/the-new-macbook-pro-unfixable-unhackable-untenable/

I have been whinging about the tech support business for a while now. So much that I finally heard myself say the same thing enough times for it to sink in.  Being in IT sucks.  Fixing computers is a dying art form.  There is less and less to fix.  I have not needed to turn on a multimeter in years. Usually I can fix a computer simply by swapping it with another computer more easily than I can repair anything.  It's not even a case of swapping components anymore.  It's not that the parts are not available (unless it's a laptop...), it's just that they are consumer products.  No one wants or needs to care about what's inside their computer now.  There are still boxes being sold that are under-powered and under-spec for what they are sold to do; but that's a matter for consumer law.  It should not be "fixed" by an end consumer needing to "upgrade" their consumer product to do what it should have done in the first place.  This is a game for rev-heads and hard-core geeks.  Everyone else should be able to buy a box, unpack it, plug it in and get on with their lives.  This is the nature of consumption.  It's about making all the choices at the point of sale rather than continually having to make choices and spend money through the life of a product.  Anything else is shitty product design.


Yes, the geeks will lament the passing of the time when they were vital to everyone's lives.  Just like the old-time geeks lamented the days when TV sets needed constant tweaking and secret knowledge of soldering irons to keep them going.  This is just the computer industry finally growing up and making products that do what they claim straight out of the box.

Now I can send my grandmother (not literally as she has passed away) out to buy a computer (iPad) and she does not have to make any choices that she does not have the capacity to make.  She knows she needs a computer to do some things... that should be enough.  Job done. 

The rest of the half-assed solutions where she needs to get a technician to try to talk her through a range of choices about monitors or RAM or hard drive specs or the trade-off between dual core and higher clock speed are bullshit.  She does not have the background to make these choices and the technicians rarely have the capacity to frame the discussion in a meaningful way.  So it's a bullshit exercise.  This is why "consumers" are voting with their wallets and buying all-in-one solutions that do not require some bullshit exercise where they are made to feel stupid.  Who would sign up for that if they were given a choice?  The thing to keep in mind is that most people do not give a fuck about computers, software, apps, phones, anything in the comms stack, operating systems, GUI widgets, open or closed platforms or any of the other bullshit that swirls around in the self-indulgent geek-sphere.  It's all bullshit.

And it's dying out... again.

This is a pattern, people: look at photography, look at any consumer electronics, look at cars, look at toasters.  Anything that has to be accessible to "very" ordinary people has to be incredibly uncomplicated, with very simple decision points in the purchase chain and very simple ownership models.  It just has to work!  Mostly in spite of what the owners do to it.

Reduce choice, reduce complexity, reduce flexibility. Increase robustness and fault tolerance, and deal with the lack of precision from the users gracefully (or hide the effect altogether).

Think of all the bullshit "Design Scenarios" that you hear in these retrospective articles on famous device design....

(and I may be paraphrasing here) "...created it to fill that need where rich, young, hip, highly educated people with endless resources and time to appreciate the most beautiful things could fiddle with our delightful device and enrich their existence while reclining in their stark white apartment surrounded by shiny shit..."

Fuck that.  

Make a device work on a sheep station in the middle of a desert with illiterate staff who hate the device and are constantly trying to kill it to avoid work.  Make it work for a fisherman who is trying to feed his family while out killing wildlife in an ocean that's trying to kill him.  Make it work in a fucking coal mine with intermittent blackouts, toxic gas, heavy machinery and constant explosions.

Go on. Design a fucking device that can operate in those conditions and make it work without a geek in sight. From opening the box.... otherwise fuck off. 

Geeks have had their day.... IT jobs are drying up.  All the easy stuff has been consolidated.  It's only the hard bits that are still around: integration in small businesses, custom hardware solutions for niche problems, dragging paper-based systems or old legacy systems into this century.  Low-margin work, with low-skilled clients with poor business models.

The home computing environment is done.  Buy a box. Buy some apps. Go.  No Geek needed.



Wednesday, June 20, 2012

More on the economics of the mobile space

http://ralphbarbagallo.com/2012/06/18/the-big-business-of-small-audiences-in-mobile-games/

This article is a nice little snapshot of the changing dynamic in the mobile game/app space.  It's interesting and kinda obvious that the market has reached saturation the way it has.  It will be interesting to see if the dynamics are the same as the PC software market of years gone by, when there was 1) an industry leader with a "Rolls Royce" product and a price tag to match, 2) a product for everyone who hated the industry leader, with similar features, some compatibility and a "budget" price tag, and 3) a small tail of partial/imitation/amateur variants as freeware/shareware/scamware, with some features, high bug counts and unstable market niches.  And to illustrate...

http://www.lukew.com/ff/entry.asp?1563

My guess would be that within most of the market segments in the app store, there will be a similar breakdown.  It's only between the top couple of contenders that there is any action as they fight for their market share/profit margins.  The rest will fade away as their startup capital runs out and their dreams of wealth get a little more real.  These are the people factors.

The other thing that has a huge attrition effect on the market is platform changes.  It costs little to keep an app alive once it's "done". However, having to update it or massage it or keep pumping marketing money into it will bleed a loss-making product to death, so if the market dictates that there is a high maintenance cost to any app, expect to see a fairly quick rationalisation of the segments.

The problem with the app store is that the whole marketplace has still not explored all the possible applications that people can/should/might do with a phone/tablet.  Until there is "an app for that" that tests each possible use and gets some sense of demand/need/want... there will be a lot of "prospecting" going on.  This will keep the app store inflated with lots of new entrants and old-timers.  This is similar to the model that gold mining boom towns go through.  Until it's clearly proven to all the dreamers that there is no easy fortune... they will keep coming and keep pouring their energy into the app store.  Once it gets hard and the rewards shrink and carry a larger and larger entry cost, you will see rationalisation and consolidation.  Larger companies will be formed that can efficiently extract the value from the available market.  These will jockey for position but generally get on with becoming more efficient.

I would expect most of the niche segments to rationalise out to about half a dozen profitable apps or less.

We will then see more rationalisation as apps consolidate across niches and cross compete by eating each others markets. 

This little ecosystem will bubble along until there is another platform change or a new platform entrant (Microsoft Surface?) which will upset the equilibrium by rapidly changing one or more variables (pull away capitalisation, start another round of gold-rush mentality, tap a new group of buyers, offer new technical features or make something obsolete).

I would predict that radical changes in platform will become rarer, simply because they destroy the existing application pool for the devices.  I think Apple will be very careful about breaking existing applications now that Microsoft are solidly in the game.  Backward compatibility has always been a bigger issue for MS than for Apple.  Android is in a total mess with the Google platform, so I would expect to see them either stabilise that or go into a death spiral now that MS is in the game.

MS will eat Google's lunch simply because they can walk into the enterprise space, if they produce a good product.  If they produce a great product, they will sew up the enterprise space and a large chunk of the pro-sumer and power-user space simply by existing.  If they produce an insanely great product and it has real sex appeal, they can take a chunk out of the mid and top end of Apple's pie.  The thing they cannot do is create a cheap product to compete with all the Android crapware.  So the bottom end of the market will still belong to Android rubbish and cheap iPads for a while longer.  But then again, there's no money in that segment... so who cares.

Ok... FIGHT!

Storytelling tips from a Pixar ex

http://storyshots.tumblr.com/post/25032057278/22-storybasics-ive-picked-up-in-my-time-at-pixar

These are some interesting ideas for writers of narrative, some of which are applicable to story generation systems.

Wednesday, June 13, 2012

Robot and AI Ethics in the Economist

http://www.economist.com/node/21556234

Some of this material sounds familiar....

Linkedin Cracking strategies

https://community.qualys.com/blogs/securitylabs/2012/06/08/lessons-learned-from-cracking-2-million-linkedin-passwords

This article contains some interesting hints about discovering rules using iterative search techniques. Reading it spurred some ideas about discovering social and group rules that could be useful for heuristic-based systems.
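
A toy sketch of the iterative idea (the word list, targets and rules below are entirely made up, not taken from the article): score candidate mangling rules by how many target items they actually hit, keep the productive ones, and repeat.

```python
# Hypothetical sketch of iterative rule discovery: apply candidate transformation
# rules to a base word list, score each rule by hits against the target set,
# keep the winners and iterate/compose them in the next round.

base_words = ["password", "linkedin", "monkey"]
targets = {"password1", "linked1n", "Monkey", "m0nkey"}

rules = {
    "append_1":   lambda w: w + "1",
    "capitalise": lambda w: w.capitalize(),
    "leet_o":     lambda w: w.replace("o", "0"),
    "leet_i":     lambda w: w.replace("i", "1"),
}

scores = {name: sum(1 for w in base_words if rule(w) in targets)
          for name, rule in rules.items()}

# Keep the productive rules; a fuller system would compose the survivors into
# new candidate rules and run another pass.
survivors = [name for name, hits in sorted(scores.items(),
                                           key=lambda kv: kv[1],
                                           reverse=True) if hits]
print(survivors)
```

The same loop works just as well for guessing social or group conventions: propose a rule, score it against observed behaviour, keep what predicts well.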

Kickstarter Stats

http://www.appsblogger.com/kickstarter-infographic/

Interesting analysis of the Kickstarter Ecosystem.

Doom 3 Source Code Reading

http://fabiensanglard.net/doom3/index.php

This guy gets better with each release.  Fascinating work.

Support for XP and the .NET 4.0 -> 4.5 situation

I like the fact that Blogger has a content warning at the front. It gives me the option to express my frustration.  Not that my level of frustration is the same as some others'... but still; it's mine.

http://social.msdn.microsoft.com/Forums/en/wpf/thread/c05a8c02-de67-47a9-b4ed-fd8b622a7e4a

http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/2723735-make-net-4-5-work-on-any-os-that-supports-4-0

The sheer level of political dickheadedness that has created this particular shit corner is just silly.  Implementing a policy to try to choke off enterprise use of XP artificially fast is just.... stupid.  It will not work.  But it will inflict a lot of pain on the people in the middle (IT and developers) who have to deal with the situation.

Talk about a clusterfuck.  Who wins?  Microsoft are not going to force anyone to upgrade who was not already happy to upgrade.  The enterprises that did not want to upgrade... will just stop patching the machines and support the platform as it exists.  The developers will be forced to run older systems without patches and they will not move to the shiny new tools that MS is inflicting on us.  Fewer license fees again.  The fact that the shiny new tools are quite flawed... is another problem altogether.

I have decided to try to distance myself from the MS stupidity.  I can really only move to pure Win32 for the GUI via wxWidgets and keep the rest of the codebase in pure C++.  It's going to be painful to excise all the WinForms... but it's just not reasonable to move forward with that platform.  It's dead.

Thankfully all my .NET codebases have low user counts.  It's still something I have to be aware of and plan for, as we are still in the process of upgrading to Win7.  (Fingers crossed it might be over this year some time...)  I still need to upgrade all the labs... but have not been issued with licenses... so no idea when XP will actually exit my world.  In the meantime... I just have to keep it all together.

Most of the small code bases should time out as their respective research projects end... so some of this problem will just go away.  Others will need to be ported or maintained on XP... fuck!  The actual number is quite small... and if I get lucky it may turn out to be none.  But that still does not make the whole thing not my problem.... I have to keep it in my already over-full head.

And then I get back to all the VBA code I have floating around..... thankfully that generally just works... except for the fucking Mac ports.   There is no string of abuse long enough to express my frustration at the splintered fucking platforms that I have to work with. 

Constant string of fucking change requests... new systems to develop... students doing stupid shit and asking for the world.... I am tired and this is not something I need. 

Building a house on constantly shifting sand is a job for a fuckwit.  Toss in regular earthquakes and the whole thing goes from hard to just fucking pointless.  I need to simplify.... pure C++ + wxWidgets (Win32), some nice DirectX or OpenGL for big projects, VB.NET for the short-term projects and VBA for all the office stuff that never dies.  Perl, Python and Lua for recreational entertainment.  Not counting the web stuff and SQL for the DB hacking... Then there are all the embedded languages: EBasic, QuickBasic, scripting languages in all the experimental packages.  Some Matlab for spice... a little LabVIEW every so often... Shit!  I forgot about the Vista boxes and the Linux system in the Mocap lab.... It's enough to make me dust off some Java books.... *shudder*.  I wonder if the Mono project has full support for anything yet?

Dammit.  It's just not simple enough.  One language to rule them all!  Is it too much to ask?



Wednesday, June 6, 2012

Shallow Fuzzing article

http://www.smh.com.au/it-pro/security-it/attack-of-the-zeroday-hunters-20120603-1zqcf.html

This is a pretty lite look at fuzzing.  More a scary narrative for illiterates, but with some vague interest.

Friday, June 1, 2012

Wolfram System Modeller

http://blog.wolfram.com/2012/05/23/announcing-wolfram-systemmodeler/

This looks handy for a whole range of research projects.  Must have a look in more detail later.

Kicking Sand out of the box....Shiny.

http://arstechnica.com/security/2012/05/anatomy-of-a-hack-6-separate-bugs-needed-to-bring-down-google-browser/?comments=1#comments-bar

This is a nice overview of a successful claim on Google's prize for breaking the sandbox in Chrome.  Very nice work, especially the illustration.