Monday 6 October 2014

Smart Grids leverage Social Innovation in Energy and Utilities





Although a lot has been said about smart grid projects in the Netherlands (Overview Netherlands) and Europe (Overview Europe), the real adoption of smart grids will eventually depend on social acceptance. 

It is therefore relevant to look at the social aspects of smart grid projects, in particular from the perspective of Sustainable Energy Communities: the groups of people that are involved in smart grid projects.

As a phenomenon, Sustainable Energy Communities play a key role in the acceptance and success of sustainable energy initiatives. These communities invest in and exploit microgeneration and reconfigure social practices. They can thus be regarded as incubators of social innovation in the field of energy and utilities.

Trigger: Arentsen, M. and Bellekom, S., "Power to the people: local energy initiatives as seeds of innovation?", Energy, Sustainability and Society 4, 1-12, 2014


Tuesday 9 September 2014

BigData challenges similar to C64 hacking in the Demo scene




Yesterday I stumbled upon a presentation video, 'Behind the scenes of a C64 demo'. It was about programming the Commodore 64, the most popular home computer of the 1980s. The speaker, Ninja/The Dreams, is an assembly master and is still getting even more out of his C64 each day!

In the presentation he stresses a few fundamental programming concepts which are, in my opinion, quite similar and relevant to the challenges we face today in the field of BigData.

Remarkably beautiful (and greatly feeding my personal enthusiasm) is the fact that the backdrop of this story is the so-called 'demoscene': the paradigm where art, math, coding and experience meet.

The first similarity between old-school demo coding and programming applications in the field of BigData is dealing with limitations. In the art of demo programming, the programmer (coder) is faced with the limitations of the machine, regardless of whether it is a C64, a GPU card, or a quad-core PC he or she is coding for. It is the coder's combination of skill in programming, math, cleverness and artistry that makes the difference in dealing with these limitations. Success comes with the 'wow experience' of the coder (and the audience). Today, similar challenges hold for applications in the field of BigData: there are limitations on the amount and variety of data that can be handled, and similar skills, together with the value of the algorithms, make the difference between an application's success and failure.
The second similarity showed up when I saw the killer line in his presentation: 'Data is code and code is data'. This holds for self-modifying code on the C64, as he explains, but of course also for BigData analytics applications. Let me try to explain this in a few lines. Consider the common see-do/plan-act cycle: data is taken from a particular environment, information is extracted from that data, the information is fed into decision support systems, and finally actions based on those decisions are taken back into the same environment the data was taken from. These steps come with algorithms whose parameters are derived from that same environment, either obtained automatically (thanks to learning algorithms) or empirically retrieved by humans observing that environment. The data is changed by the code, and the code is tuned by the data. The concept holds for that old C64, but also for today's paradigm of BigData.
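The see-do cycle can be sketched in a few lines of Python. All names below are hypothetical, chosen only for illustration: the code tunes a parameter from data sensed in a toy environment, and the action taken with that parameter changes the very environment that produces the next data sample.

```python
# Toy sketch of the see-do/plan-act cycle: the code is tuned by the data,
# and the data is changed by the code.

def sense(environment):
    """'See': take a data sample from the environment."""
    return environment["level"]

def learn(parameter, observation, rate=0.5):
    """'Plan': tune the code's parameter from the data (simple online update)."""
    return parameter + rate * (observation - parameter)

def act(environment, parameter):
    """'Do': feed the decision back into the same environment."""
    environment["level"] -= 0.1 * parameter  # the code now changes the data

environment = {"level": 10.0}
parameter = 0.0
for step in range(20):
    observation = sense(environment)          # data from the environment
    parameter = learn(parameter, observation) # code tuned by the data
    act(environment, parameter)               # data changed by the code

print(round(parameter, 2))
```

After a few iterations the learned parameter tracks the (shrinking) level of the environment: neither the code nor the data can be understood in isolation, which is exactly the point of the killer line.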

Sometimes I think that there are only a few types on the dancefloor in this universe: humans, machines, algorithms and data...



Friday 4 April 2014

BigData analytics projects require a completely different approach from other ICT projects

Read in Harvard Business Review (HBR): "Why IT Fumbles Analytics" (D.A. Marchand, J. Peppard)

"Despite large investments in data scientists and ICT projects, many organisations struggle to capitalise on the large amounts of data available to them.
According to Marchand and Peppard, the cause lies in the way these projects are conceived and executed, with the focus wrongly placed on designing, building and deploying along a traditional pattern (based on time, budget and overall planning).
The solution lies in following a different project approach, with a focus on a) understanding how people create and use information, and b) execution in the form of experiments, a focus on hypotheses, and developing knowledge through iterative processes."



From my own experience I would add two elements to that approach:
c) Analytics projects should be addressed as a joint effort with other parties in the problem domain, for example small knowledge organisations (niche players), data providers (both open and closed sources), academic institutes or research groups, and not to forget: the customer. Call it co-creation to the max.
d) Both in the process and in the solution (e.g. in the models), one should strive for optimal collaboration between human and machine. I like to refer to the insights described in Bennie Mols' book "Turings Tango", ISBN 9789046812372. For example, it is good to compare the outcomes of the computer models with human findings between two iterations. It is also good to build the models not only on the data, but also on the expert knowledge of humans. What matters is an optimal interplay between human and computer. Parties that understand this well include the Alan Turing Institute Almere (ATIA), Synerscope and PersuasionAPI.

 Wico Mulder,
4-4-2014

PS: thanks to Hans Moonen for pointing out the article. 
PS2: Copyright prevents me from linking to the HBR article. Apologies. I did, however, find this blog where the full text can also be read.





Tuesday 4 March 2014

Advanced Analytics: a crucial discipline in a Data Driven Society


We live in a 'data driven society' where BigData, today's hot topic, manifests itself in many industrial domains, such as healthcare, public safety, finance and other commercial areas. BigData is commonly characterised in terms of four V's: Volume, Variety, Velocity and Veracity. Volume and Variety require organisations to have a firm grip on information management, Velocity calls for fast, optimised algorithms, and Veracity focuses on governance, compliance, context and information value. It is here that the discipline of Advanced Analytics plays a crucial role. But what actually is Advanced Analytics?

In general, analytics refers to the discipline that encompasses all processes and techniques to extract information and models from data. In the IT industry, analytics is understood as the process or method of logical analysis leading to the discovery and communication of meaningful patterns in data, using statistics, computer programming and operations research to quantify and optimise performance.

The discipline of Advanced Analytics includes advanced techniques and methodologies such as data modelling, machine learning and data mining to detect patterns or find relationships in the data. Typical techniques include clustering, decision tree building, text analysis, context mining, trend analysis and predictive modelling. An interesting aspect is finding ways in which prior knowledge held by humans can be combined with data sources such that it optimises the learning processes and the deduction of statistical or semantic patterns.
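As a small illustration of one of the techniques named above, here is a minimal one-dimensional k-means clustering sketch in plain Python. It is an illustrative toy only; real projects would use a tested library implementation.

```python
# Minimal 1-D k-means: alternate between assigning points to their nearest
# centroid and moving each centroid to the mean of its assigned points.

def kmeans_1d(points, k, iterations=10):
    centroids = points[:k]  # naive initialisation: first k points
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        # (keep the old centroid if a cluster ends up empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
print(kmeans_1d(data, 2))  # two groups, around 1 and 10
```

Even this toy shows the pattern behind many of the listed techniques: an iterative algorithm whose model (the centroids) is tuned by the data it is applied to.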
Most organisations stress the business expectations of advanced analytics by saying that "advanced analytics involves data analysis and sophisticated quantitative methods to produce insights that traditional approaches (e.g. business intelligence) are unlikely to discover". Although I agree with this, it is here that I want to place a strong remark: the discipline of advanced analytics is not limited to business processes, organisational management and customer behaviour. Its value also manifests itself in other areas, such as public safety and healthcare, with goals related to situational awareness and personalised medicine.

The discipline of Advanced Analytics will become commonplace within 2-5 years. In the meantime, while the heat of the BigData hype settles down to normal temperatures, we might start to think about the next thing: "what would be the role and influence of individual humans in a connected, data driven and perhaps predictable world?"

Wico Mulder
4-3-2014