With the retirement of www.tbig.com.au, I have chosen to move this blog to www.infodrivenbusiness.com/blog. I hope to see you there soon!
Posted on 09/27/2014 at 09:21 PM | Permalink | Comments (0) | TrackBack (0)
Everyone is talking about digital disruption and the need to transform their company into a “digital business”. However, ask most people what a digital business is and they’ll talk in terms of online shopping, mobile channels or the latest wearable device.
In fact, we have a long tradition of using the word “digital” as a prefix for new concepts while we adapt to them. The “digital computer”, widely introduced from the 1970s, is a term that no longer needs its prefix. Similarly, the “digital mobile phone” replaced its analogue equivalent in the 1990s, introducing security and many new features including SMS. The scandals caused when radio hams listened in to analogue calls made by politicians seem like a distant memory!
The term digital really just refers to the use of ones and zeros to describe something in a way that is exactly repeatable, rather than as an inexact analogue stream. Consider, for instance, the difference between the ones and zeros used to encode the music in an audio file, which will replay with the same result now and forever, and the physical fluctuations in the groove of an old vinyl record, which will gradually degrade. Not only is the audio file more robust, it is also more readily manipulated into new and interesting combinations.
So it is with digital business. Once the economy has successfully made the transition from analogue to digital, it will be natural for all business to be thought of in this way. Today, however, too many people put a website and mobile app over the top of their existing business models and declare the job done.
Their reason for doing this is that they don’t actually understand what a digital business is.
Digital business separates its constituent parts to create independent data and processes which can then be rapidly assembled in a huge number of new and innovative ways. The move to digital business is actually just a continuation of the move to the information economy. We are, in fact, moving to the Information-Driven Business that puts information rather than processes at its core.
Airlines are a good example. Not that many years ago, the process from ticketing through to boarding a flight was analogue, meaning that each step led to the next and could not be separated. Today, purchasing, ticketing and boarding a flight are completely independent and can each use completely different processes and digital technology without impacting each other. Passenger handling for airlines is now a digital business.
What this means is that third parties or competing internal systems can work on an isolated part of the business and find new ways of adding value. For retailers, this means that the pictures and information supporting products are independent of the website that presents them, and certainly of the payment processes that facilitate customer transactions. A digital retailer has little trouble sharing information with new logistics, payments and mobile providers to quickly develop more efficient or new routes to market.
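As a loose sketch of that separation (the record and the channel functions below are my own illustration, not any particular retailer’s system), the same independently managed product data can serve a website, a mobile feed and a logistics partner without those channels depending on one another:

```python
# One independently managed data asset...
product = {
    "sku": "A-1001",
    "name": "Espresso machine",
    "images": ["front.jpg", "side.jpg"],
    "price": 349.00,
    "weight_kg": 9.5,
}

# ...consumed by channels that know nothing about one another.
def website_listing(p):
    return {"title": p["name"], "hero_image": p["images"][0], "price": p["price"]}

def mobile_feed_entry(p):
    return {"sku": p["sku"], "name": p["name"], "price": p["price"]}

def logistics_manifest_line(p):
    return {"sku": p["sku"], "weight_kg": p["weight_kg"]}

print(website_listing(product), logistics_manifest_line(product))
```

Each channel can be replaced or added to without touching the others, which is the point of putting the data, rather than any one process, at the core.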
In the 1970s and 1980s, businesses were largely built up from thousands of processes. Over time, automation has allowed that number to explode. When processes required clerical staff, the number of options was limited by the available labour; with automation, every issue can apparently be solved by adding another process.
Where digital business is about breaking activities up into discrete parts which can be reassembled, analogue business tends to be made up of processes which are difficult or impossible to break apart.
The problem with this is that the organisation becomes a maze of processes. They become increasingly interdependent, to the point where it is impossible to break them apart.
Many businesses have put mobile and web solutions over the top of this maze. While the result can look fantastic, it doesn’t take long before the wheels fall off. Customers experience inconsistent product, delivery or price options depending on whether they ring the call centre or look online. They find that the promised stock is no longer available because the warehouse processes are not properly integrated with the online store. Without seamless customer information, they aren’t addressed with the same premium privileges or priority across all channels.
In many cases, the process maze means that the addition of a digital façade can do more harm than good.
Ironically, the reverse model of having a truly digital business with an analogue interface to the customer is not only valid but often desirable. Many customers, particularly business clients, are themselves dependent upon complex processes in their own organisations and it makes perfect sense to work with them the way that they want to work. An airline that is entirely digital in its core can still deal with corporate travel agents who operate in an analogue world.
I have previously argued that the technology infrastructure that supports digital business, cloud, is actually the foundation for business transformation (see Cloud computing should be about new business models).
Every twenty-first century business needs to define itself in terms of its information assets and differentiated intellectual property. Business transformation should focus on applying these to the benefit of all of its stakeholders including customers, staff, suppliers and shareholders.
Starting from the core and building a new enterprise around discrete, digital, business capabilities is a big exercise. The alternative, however, is to risk being pulled under in the long-term by the weight of complexity.
No matter how many extra interfaces, interim processes or quick fixes are put in place, any business that fails in this transformation challenge will ultimately be seen as offering no more than a digital façade on a fading analogue business.
Posted on 08/23/2014 at 05:04 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
Technology can make us lazy. In the 1970s and 80s we worried that the calculator would rob kids of insight into the mathematics they were learning. There has long been evidence that writing long-hand and reading from paper are far superior vehicles for absorbing knowledge than typing and reading from a screen. Now we need to wonder whether that ultimate pinnacle of humanity’s knowledge, the internet, is actually a negative for businesses and government.
The internet has made a world of experience available to anyone who is willing to spend a few minutes seeking out the connections. Increasingly we are using big data analytics to pull this knowledge together in an automated way. Either way, the summed mass of human knowledge often appears to speak as one voice rather than the cacophony that you might expect of a crowd.
The crowd quickly sorts out the right answer from the wrong when there is a clear point of reference. The crowd is really good at responding to even complex questions. The more black or white the answer is, the better the crowd is at coming to a conclusion. Even creative services, such as website design, are still problems with a right or wrong answer (even if there is more than one) and are well suited to crowd sourcing.
As the interpretation of the question or weighting of the answer becomes more subjective, it becomes harder to discern the direction that the crowd is pointing with certainty. The lone voice with a dissenting, but insightful, opinion can be shouted down by the mob.
The power of the internet to answer questions is being used to test new business ideas just as quickly as to find out the population of Nicaragua. Everything from credit cards to consumer devices are being iteratively crowd sourced and crowd tested to great effect. Rather than losing months to focus groups, product design and marketing, smart companies are asking their customers what they want, getting them involved in building it and then getting early adopters to provide almost instant feedback.
However, the positive can quickly turn negative. The crowd comments early and often. The consensus usually reinforces the dominant view. Like a bad reality show, great ideas are voted off before they have a chance to prove themselves. If the idea is too left-field and doesn’t fit a known need, the crowd often doesn’t understand the opportunity.
In the 1960s and 1970s, many scientists argued that an artificial brain would display true intelligence before the end of the twentieth century. Research efforts largely ground to a halt as approach after approach turned out to be a dead-end.
Many now argue that twenty-first century analytics is bridging the gap. By understanding what the crowd has said and finding the response to millions, hundreds of millions and even billions of similar scenarios, the machine is able to provide a sensible response. This approach even shows promise of passing the famous Turing test.
While many argue that big data analytics is the foundation of artificial intelligence, it isn’t providing the basis of brilliant or creative insight. IBM’s Watson might be able to perform amazing feats in games of Jeopardy but the machine is still only regurgitating the wisdom of the crowd in the form of millions of answers that have been accumulated on the internet.
No amount of the crowd or analytics can yet make a major creative leap. This is arguably the boundary of analytics in the search for artificial intelligence.
For the first time, digital disruption, using big data analytics, is putting white-collar jobs at the same risk of automation that blue-collar workers have had to navigate over the last fifty years. Previously we assumed process automation would solve everything, but our organisations have become far too complex.
Business process management and automation have reached a natural limit in taking out clerical work. As processes have become more complex, and their number of interactions has grown exponentially, it has become normal for the majority of instances to display some sort of exception. Employees have gone from running processes to handling exceptions. This change in job function has largely masked the loss of traditional clerical work since the start of the mass rollout of business IT.
Most of this exception handling, though, requires insight but no intuitive leap. When asked, employees will tell you that their skill is to know how to connect the dots in a standard way to every unique circumstance.
Within organisations, email and, increasingly, social platforms have been the tools of choice for collaboration and crowdsourcing solutions to individual process exceptions. Just as big data analytics is automating the hunt for answers on the internet, it is now starting to offer the promise of the same automation within the enterprise.
In the near future, applications driven by big data analytics will allow computers to move from automating processes to also handling any exceptions in a way that will feel almost human to customers of everything from bank mortgages to electric utilities.
Just as many white-collar jobs moved from running processes in the 70s and 80s to handling their exceptions in the 90s and the new millennium, those same roles now need to move on again and find something new.
At the same time, the businesses they work for are being disrupted by the same digital forces and are looking for new sources of revenue.
These two drivers may come together to offer an opportunity for those who have spent their time handling exceptions, whether for customers or for internal processes. The future lies in spotting openings in the business through intuitive insight and creative leaps, and turning them into product or service inventions, rather than seeking permission from a crowd that will force a return to the conservative norm.
Perhaps this is why design thinking and similar creative approaches to business have suddenly joined the mainstream.
Posted on 07/26/2014 at 12:55 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
For years now the physics community has been taking the leap into computer science through the pursuit of the quantum computer. As weird as the concepts underpinning the idea of such a device are, even weirder is the threat that this machine of the future could pose to business and government today.
There are many excellent primers on quantum computing but in summary physicists hope to be able to use the concept of superposition to allow one quantum computer bit (called a “qubit”) to carry the value of both zero and one at the same time and also to interact with other qubits which also have two simultaneous values.
A quantum computer would, it is hoped, come up with answers to useful questions in far fewer processing steps than a conventional computer, because many different combinations would be evaluated at the same time. Algorithms that use this approach generally fall into the category of solution finding (best paths, factors and other similarly complex problems).
As exciting as the concept of a quantum computer sounds, one of the applications of this approach would be a direct threat to many aspects of modern society. Shor’s algorithm provides an approach to integer factorisation using a quantum computer which is like a passkey to the encryption used across our digital world.
The cryptography techniques that dominate the internet are based on the principle that it is computationally infeasible to find the factors of a large number. However, Shor’s algorithm provides an approach that would crack the code if a quantum computer could actually be built.
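As a rough illustration of the threat (this classical sketch is my own, and it brute-forces the one step a quantum computer would make efficient), the number theory behind Shor’s algorithm turns factoring into period finding: find the period r of a^x mod N and the factors of N fall out of a couple of greatest-common-divisor calculations.

```python
from math import gcd

def factor_via_period(N, a):
    """Classical illustration of the arithmetic behind Shor's algorithm.

    A quantum computer's only job is to find the period r of a^x mod N
    efficiently; here we brute-force r just to show the steps around it.
    """
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess already shares a factor with N
    # Find the smallest r with a^r = 1 (mod N) -- the expensive step.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2:                     # odd period: pick another a and retry
        return None
    x = pow(a, r // 2, N)
    if x == N - 1:                # trivial square root: pick another a and retry
        return None
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_via_period(15, 7))   # (3, 5)
```

Running it with N = 15 and a = 7 finds the period 4 and recovers the factors 3 and 5. Everything except finding r is cheap, which is exactly why an efficient quantum period finder would unpick the factoring-based encryption that protects so much of today’s traffic.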
We’re familiar with the businesses of today being disrupted by the new technology of tomorrow. But just as weird as the concept of quantum superposition is the possibility that the computing of tomorrow could disrupt the business of today!
We are passing vast quantities of data across the internet. Much of it is confidential and encrypted: messages that we are confident will remain between the sender and receiver. They include payments, conversations and, through the use of virtual private networks, much of the internal content of both companies and governments.
It is possible that parties hoping to crack this content in the future are taking the opportunity to store it today. Due to the architecture of the internet, there is little to stop anyone from intercepting much of this data and storing it without anyone having any hint of its capture.
In the event that a quantum computer capable of running Shor’s algorithm is built, the first thought will need to be what content could have been intercepted and what secrets might now be exposed. The extent of the exposure could be far greater than appears at first glance.
There is one commercially available device marketed as a quantum computer, called the D-Wave (from D-Wave Systems). Sceptics, however, have published doubts that it is really operating based on the principles of Quantum Computing. Even more importantly, there is no suggestion that it is capable of running Shor’s algorithm or that it is a universal quantum computer.
There is a great deal of evidence that the principles of quantum computing are consistent with the laws of physics as they have been uncovered over the past century. At the same time as physics is branching into computing, the information theory branch of computing is expanding into physics. Many recent developments in physics are borrowing directly from the information discipline.
It is possible, though, that information theory as applied to information management problems could provide confidence that a universal quantum computer is not going to be built.
Information entropy was initially constructed by Claude Shannon to provide a tool for quantifying information. While the principles were deliberately analogous to thermal entropy, it has subsequently become clear that the information associated with particles is as important as the particles themselves. Chapter 6 of my book, Information-Driven Business, explains these principles in detail.
It turns out that systems can be modelled on information or thermal entropy interchangeably. As a result, a quantum computer that needs to obey the rules of information theory also needs to obey the laws of thermal entropy.
The first law of thermodynamics was first written by Rudolf Clausius in 1850 as: “In all cases in which work is produced by the agency of heat, a quantity of heat is consumed which is proportional to the work done; and conversely, by the expenditure of an equal quantity of work an equal quantity of heat is produced”.
Rewording over time has added sophistication but fundamentally, the law is a restatement of the conservation of energy. Any given system cannot increase the quantity of energy or, as a consequence of the connection between thermal and information entropy, the information that it contains.
Any computing device, whether classical or quantum in nature, consumes energy based on the amount of information being derived, as determined by the information entropy of the device. While it is entirely possible that massive quantities of information could be processed in parallel, there is no escaping this requirement: a quantum computer truly delivering that level of computing would need energy of the same order as the thousands or even millions of classical computers required to deliver the same result.
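One standard way to make this energy argument concrete is Landauer’s principle, a well-established bound that I am adding here for illustration rather than something stated above: irreversibly processing information carries a minimum energy cost, whatever the physical substrate.

```latex
% Landauer's principle: each bit of information irreversibly processed
% (erased) dissipates at least k_B T \ln 2 of energy.
E_{\min} = N_{\mathrm{bits}} \, k_B T \ln 2
% At room temperature, k_B T \ln 2 \approx 3 \times 10^{-21}\,\mathrm{J} per bit,
% so the energy floor scales with the quantity of information handled,
% however the bits are physically represented.
```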
I anticipate that developers of quantum computers will either find that the quantity of energy required to process is prohibitive or that their qubits will constantly frustrate their every effort to maintain coherence for long enough to complete useful algorithms.
Definitely! In a future post I propose to create a scorecard tracking the predictions I’ve made over the years.
However, anyone who claims to really understand quantum mechanics is lying. Faced with the unbelievably complex wave functions required by quantum mechanics, which seem to defy any real-world understanding, physicist David Mermin famously advised his colleagues to just “shut up and calculate!”.
Because of the impact a future quantum computer would have on today’s business, the question is far from academic and deserves almost as much investment as the exploration of these quantum phenomena does in its own right.
At the same time, the investments in quantum computing are far from wasted. Even if no universal quantum computer is possible, the specialised devices that are likely to follow the D-Wave machine are going to prove extremely useful in their own right.
Ultimately, the convergence of physics and computer science can only benefit both fields as well as the business and government organisations that depend on both.
Posted on 06/22/2014 at 08:47 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
Is there a future for careers in Information Technology? Globally, professional societies such as the British Computer Society and the Australian Computer Society have long argued that practitioners need to be professionals. However, there is a counter-argument that technology is an enabler for all professions and is more generally a capability of many rather than a profession of the few.
At the same time, many parents, secondary school teachers and even tertiary educators have warned students that a Technology career is highly risky, with many traditional roles being moved to lower-cost countries such as India, China and the Philippines. Newspaper headlines in recent years about controversy over the use of imported workers in local Technology roles have only served to further unsettle potential graduates.
Organisations increasingly realise that if they don’t encourage those who have information and insight about the future of technology in their business, they may be creating a lumbering hierarchy that is incapable of change.
How should companies seek out those innovations that will enable the future business models that haven’t been invented yet? Will current technology savings cause “pollution” that will saddle future business initiatives with impossible complexity? Is the current portfolio of projects simply keeping the lights on or is it preparing for real change? Does the organisation have a group of professionals driving change in their business in the years to come or do they have a group of technicians who are responding without understanding why?
These questions deeply trouble many businesses and are leading to a greater focus on having a group of dedicated technology professionals at every level of the organisation and often dispersed through the lines of business.
The recognition of the need for these change agents should answer the question about the future of the profession. At a time when business needs innovation that can only be achieved through technology, society is increasingly worried about a future where every interaction might be tracked.
While the Information Technology profession has long talked about the ethics of information and privacy, it is only recently that society is starting to care. With the publicity around the activities of government and big business starting to cause wide concern, it is likely that the next decade will see a push towards greater ownership of data by the customer, more sophisticated privacy and what is being dubbed “forget me” legislation where companies need to demonstrate they can completely purge all record of an individual.
While every business will have access to advice at senior levels, it is those who embed Information Technology professionals at every level through their organisation that will have the ability to think ahead to the consequences of each decision.
These decisions often form branches in the road. While requirements can often be met by different, but apparently similar, paths, the difference between the fastest route and the slowest is sometimes measured in orders of magnitude. Sometimes these decisions turn out to be the difference between success and failure. A seemingly innocuous choice of a particular building block, external interface or language can be either lauded or regretted many years later.
Ours is a career that has invited many to join from outside, and the possibilities created by the digital and information economy have enticed many who have tinkered to make Information Technology their core focus. While this is a good thing, it is critical that those relying on technology skills can have confidence in the decisions being made both now and in the future.
Practitioners who have developed their knowledge in an ad-hoc way, without the benefit of testing their wider coverage of the discipline, are at risk of making decisions that meet immediate requirements but cut off options for the future, or leave the organisation open to structural issues that only become apparent decades later. In short, these people are often good builders but poor architects.
Casual observers of the industry can be forgiven for thinking that the constant change in technology means that skills of future practitioners will be so different to those of today as to make any professional training irrelevant. Anyone who holds this view would be well served by reading relevant Technology articles from previous eras such as the 1980s when there was a popular perception that so-called “fourth generation languages” would mean the end of computer programming.
While the technical languages of choice today are different to those of the 1970s, 80s and subsequent decades, the fundamental skills are the same. What’s more, anyone who has developed professional (as opposed to purely technical) skills as a developer using any language can rapidly transition to any new language as it becomes popular. True Technology professionals are savvy to the future using the past as their guide and make good architecture their goal.
Certainly the teaching and foundations of Technology need to change. There has been much too much focus on current technical skills. The successful Technologist has a feel for the trends based on history and is able to pick-up any specific skill as needed through their career.
Senior executives, regardless of their role, express frustration about the cost and complexity of doing even seemingly simple things such as preparing a marketing campaign, adding a self-service capability or combining two services into one. No matter which way you look at it, it costs more to add or change even simple things in organisations due to the increasing complexity that a generation of projects have left behind as their legacy (see Value of decommissioning legacy systems).
It should come as no surprise that innovation seems to come from Greenfield start-ups, many of which have been funded by established companies whose own legacy stymies experimentation and agility.
This need to start again is neither productive nor sustainable. Once a business accepts the assertion that complexity caused by the legacy of previous projects is the enemy of agility, then they need to ask whether their ICT capabilities are adding to the complexity while solving immediate problems or if they are encouraging ICT professionals to create solutions that not only meet a need but also simplify the enterprise in preparation for an unknown tomorrow.
Posted on 05/25/2014 at 08:22 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
Just how productive are Chief Information Officers or the technology that they manage? With technology portfolios becoming increasingly complex it is harder than ever to measure productivity. Yet boards and investors want to know that the capital they have tied-up in the information technology of the enterprise is achieving the best possible return.
For CIOs, talking about value improves the conversation with executive colleagues. Taking them aside to talk about the success of a project is, even for the most strategic initiatives, usually seen as a tactical discussion. Changing the topic to increasing customer value or staff productivity through a return on technology capital is a much more strategic contribution.
There are all sorts of productivity measures that can be applied to individual systems, but they are usually based on the efficiency of existing processes which leads to behaviours which reduce flexibility. The future of business and government depends on speed of response to change, not how efficiently they deal with a static present.
Businesses invest in information systems to have the right information at the right time to support decisions or processes. Information that is used is productive while information that is collected, but poorly applied, is wasted or unproductive.
However, to work out what proportion of information is being used, there needs to be a way to quantify it.
There is a formal way to measure the quantity of information. I introduce this extensively in chapter 6 of Information-Driven Business.
The best way to understand “quantity” in terms of information is to count the number of artefacts rather than the number of bits or bytes required to store them. The best accepted approach to describing this quantity is called “information entropy” which, confusingly, uses the “bit” as its unit of measure; it is a count of the potential permutations that the system can represent.
A system that holds 65,536 names has just 16 “bits” of unique information (log₂ 65,536 = 16). That might sound strange given that storing 65,536 names might take of the order of 6MB.
To understand why there are only 16 bits of unique information in a list of 65,536 names, consider whether the business uses the spelling of the names or whether any additional insight is being gained from the data that is stored.
Knowing how much information there is in a system opens up the opportunity to find how much information is being productively used. The amount of information being used to drive customer or management choices is perhaps best described as “decision entropy”. The decision entropy is either equal or less than the total information entropy.
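As a minimal sketch of that measure (the data and the idea of tagging which attributes actually drive decisions are my own illustration, not taken from the book), the quantity of information can be taken as log2 of the number of distinct artefacts, and the “decision entropy” as the same measure applied only to what is actually used:

```python
import math

def entropy_bits(values):
    """Information 'quantity' as described above: log2 of the number of
    distinct artefacts, not the bytes used to store them."""
    distinct = len(set(values))
    return math.log2(distinct) if distinct else 0.0

# 65,536 distinct names carry 16 bits, however many megabytes they occupy.
names = [f"name_{i}" for i in range(65_536)]
information_entropy = entropy_bits(names)
print(information_entropy)                       # 16.0

# Hypothetical case: decisions only ever depend on a coarse grouping of names.
decision_entropy = entropy_bits(n[:6] for n in names)
print(f"productive share: {decision_entropy / information_entropy:.0%}")
```

The list of 65,536 names still carries only 16 bits, and if decisions really depend on just a handful of distinct values, the productive share of that information is correspondingly small, which is exactly the conversation the CIO can take to the executive table.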
An organisation using 100% of their available information is incredibly lean and nimble. They have removed much of the complexity that stymies their competitors (see Value of decommissioning legacy systems).
Of course, no organisation productively uses all of the information that they hold. Knowing that holding unproductive information comes at a cost to the organisation, the CIO can have an engaging conversation with fellow executives about extracting more value from existing systems without changing business processes.
When looking at how business reports are really used, and how many reports lie unread on management desks, there is a lot of low hanging fruit to be picked just by improving the way existing business intelligence is used.
Similarly, customer systems seldom maximise their use of hints based on existing information to guide buyers to close the best available offer. A few digital enhancements at the front line can bring to the surface a vast array of otherwise unused information.
Globally, CIOs are finding themselves pushed down a rung in the organisational ladder. This is happening at the very same time that technology is moving from the back office to become a central part of the revenue story through digital disruption.
CIOs are not automatically entitled to be at the executive table. They have to earn the right by contributing to earnings and business outcomes. One of the best discussions for a CIO to focus on is increasing productivity of the capital tied-up in the investments that have already been made in the systems that support staff and customers.
Posted on 04/20/2014 at 07:52 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
With a little work, social networks have the potential to be as valuable in confirming an identity as a passport. It is the power of the crowd that can prove the integrity of the account holder, perhaps best described as crowdsourcing identity.
There are usually two goals of identity. The first is to confirm you are who you say you are and the second is to work out your relationship to other people.
Social networks can solve both. We’re all familiar with the burgeoning number of websites that allow you to “login” with Facebook, LinkedIn or Twitter. The vast majority, though, are simply using a convenient approach to challenge and permit access. Rather than maintaining a new set of credentials, they are using a mechanism that maintains those sensitive details externally.
This is to be applauded and is entirely consistent with the objectives of cloud to share services rather than build complete vertical solutions from the ground up. However, just accepting a social network’s credentials only uses a fraction of the capability that aligning with these services offers.
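The “login with” mechanism described above is typically an OAuth-style delegation. Here is a minimal sketch of the shape of that flow; the endpoints and parameter names are generic placeholders rather than any particular network’s real API:

```python
from urllib.parse import urlencode

# Hypothetical provider endpoints -- stand-ins, not a real network's API.
AUTHORIZE_URL = "https://social.example/oauth/authorize"
TOKEN_URL = "https://social.example/oauth/token"

def build_login_redirect(client_id, redirect_uri, state):
    """Step 1: send the user to the social network; the site never sees their password."""
    return AUTHORIZE_URL + "?" + urlencode({
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "basic_profile",
        "state": state,          # anti-forgery token checked on the way back
    })

# Step 2 (not shown): the provider redirects back with a short-lived code,
# which the site exchanges at TOKEN_URL for a token identifying the user.
# The sensitive credentials themselves stay with the social network.
```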
In past decades, our grandparents carefully checked the telephone directory when it came out to make sure all their family and friends were listed correctly. With the whole city doing the same thing, any mistakes (or even deliberate fraudsters) were pretty quickly uncovered.
Today, phone directories are barely looked at and are, at best, incomplete. Once you get through an ID check, your details are entirely within your control and very likely to go unchallenged.
Social networks are different. While the profile that is created is self-regulated, its exposure to the friends forces a level of honesty. It may be easy to create a false identity, but a profile that is fully connected with the network and is actively maintained is much harder to fake for an extended period. Some of the things to look for include: levels of activity, numbers of “friends” or connections who are themselves active and connected, cross-posting and the amount of detail on the profile.
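Purely as a sketch (the weights and thresholds below are invented for illustration), those signals could be combined into a simple confidence score:

```python
def profile_confidence(profile):
    """Naive score built from the signals listed above; weights are illustrative only."""
    score = 0.0
    score += min(profile.get("posts_per_month", 0), 20) / 20 * 0.3        # sustained activity
    score += min(profile.get("active_connections", 0), 200) / 200 * 0.4   # active, connected friends
    score += 0.2 if profile.get("cross_posts_elsewhere") else 0.0         # cross-posting
    score += min(profile.get("profile_fields_completed", 0), 20) / 20 * 0.1
    return score   # 0.0 (no evidence) .. 1.0 (hard to fake over an extended period)

print(profile_confidence({"posts_per_month": 8, "active_connections": 150,
                          "cross_posts_elsewhere": True, "profile_fields_completed": 15}))
```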
Many employers now prefer LinkedIn to a CV for the simple reason that it is harder to fake qualifications and experience. A CV prepared for an employer requires reference checking and verification that often doesn’t happen.
The media is full of stories of senior people who have been caught claiming qualifications that they never completed. Compare that to the profile on LinkedIn where there are usually hundreds of connections, any one of which will call out if a false qualification is claimed or the description of employment is exaggerated.
Moreover, for most employers the network of connections in common is extensive and a whole range of potential points of verification are added, even if confidentiality requires waiting until after employment has commenced. Just the knowledge that this is likely to happen discourages would-be fakes.
Just as people will grab their smartphone before almost any other possession in an emergency, it seems that they value their social media login credentials above almost any other password.
People will often happily give out their credentials for video streaming services (such as Netflix). They allow their trusted family members to use their banking user details. They will even allow support staff at work to have their network password. But ask for access to their Facebook or LinkedIn account and they will refuse as it sits at the centre of their trusted friend network. Access to this core is just too sensitive to share.
In the future we could see building security where you “login with Facebook” and banks using social media credentials as part of identifying a customer when creating a new account.
Whether a business or government service, it is important that the consumer or citizen receives fair value for using social media to identify themselves. The key is full disclosure.
If all that the Google, Facebook, Twitter or LinkedIn account is doing is providing access, then the exchange is one of convenience. For the user there is one less password to maintain, and for the site owner there is one less point of exposure.
However, it may be that the site or service needs to know about relationships, locations or other details which are maintained in the service. Full disclosure allows the user to feel confident about what is being used and why. If the use is appropriate to the user’s needs, this approach provides a way of updating their personal details without filling out as many forms.
Many online services need not have any username or password data at all and those that do may only need it for those customers or citizens who want to opt-out of the social media revolution. Arguably, this last group maintain less of their details online and are usually less exposed in the event of security breach.
Good practice suggests using social media as part of an identity service rather than government or business trying to create yet another master, standalone, identity solution of their own.
Posted on 03/28/2014 at 08:37 PM in Information Overload, Web/Tech | Permalink | Comments (0) | TrackBack (0)
The PC era is arguably over and the age of ubiquitous computing might finally be here. Its first incarnation has been mobility through smartphones and tablets. Many pundits, though, are looking to wearable devices and the so-called “internet of things” as the underlying trends of the coming decade.
It is tempting to talk about the internet of things as simply another wave of computing like the mainframe, mid-range and personal computer. However there are as many differences as there are similarities.
While the internet of things is arguably just another approach to solving business problems using computing devices, the units of measure are not so directly tied to processing power, memory or storage. The mass of interconnections means that network latency, switching and infrastructure integration play just as big a part, for the first time divorcing a computing trend from “Moore’s Law” and the separate but related downward trend in storage costs.
The internet of things brings the power of the information economy to every part of society, providing an opportunity to repeat many of the gains that business and government experienced through the last decade of the twentieth century. Back then there was a wave of new applications that allowed for the centralisation of administration, streamlining of customer service and outsourcing of non-core activities.
Despite all of the apparent gains, however, most infrastructure has remained untouched. We still drive on roads, rely on electricity and utilise hospitals that are based on technology that really hasn’t benefited from the same leap in productivity that has supercharged the back office.
It’s not hard to build the case to drive our infrastructure through the network. Whether it is mining, transport, health or energy, it is clear that there are mass benefits and efficiencies that are there for the taking. Even simple coordination at a device level will run our communities and companies better for everyone’s benefit. Add in the power of analytics and we can imagine a world that is dancing in synchronisation.
However, just as this new wave of computing doesn’t adhere to Moore’s Law, it also lacks the benefit of rapid consumer turnover. We’ve accepted the need to constantly replace our PCs and smartphones, driving in turn the growth of an industry. But it’s one thing to upgrade one or two devices per person every few years; it is quite another to replace the infrastructure that surrounds us.
Much of what we work with day-to-day in roads, rail, energy and all of the other infrastructure around us has a lifespan that is measured in decades or even half centuries. Even with a compelling business case, it is not likely to be enough to replace embedded equipment that still has decades of economic value ahead of it.
If we don’t want to wait until our grandchildren are old to benefit from the internet of things, we have to find a way to implement it in a staggered way.
Anyone who has renovated their house will have considered whether to build in new technology. For most, network cabling is a basic, but centralised control of lighting and other devices is a harder ask. Almost anything that is embedded in the walls runs the risk of being obsolete even before the builder hands over the keys.
The solution is independent and interoperable devices. Rather than building-in security cameras, consumers are embracing small portable network cameras that they can access through cloud-based services. The same providers are increasingly offering low-cost online light switches that don’t require anything more than an electrician to replace the faceplate. This new generation of devices can be purchased as needed rather than all at once and raise little risk of uneconomic obsolescence.
Google’s purchase of Nest appears to be an investment in this incremental, cloud-based future of interconnected home devices. By putting it in its stable of technologies, it is a fair bet that Google sees the value in offering analytics to optimise and better maintain a whole range of services.
Just as companies focused on the consumer at home are discovering an approach that works, so too are those vendors that think about bigger infrastructure including energy, mining and major transport assets.
The car industry is going to go through a revolution over the next decade. After years of promise, new fuel technologies are bearing fruit. It’s going to be both an exciting and uncertain ride for all involved. Hybrid, electric pluggable, electric swappable and hydrogen all vie for roles in the future.
Just as important as fuel technology is the ability to make cars autonomous. So-called “range anxiety”, the fear of running out of power in an electric car, is eliminated if the car’s computer makes the driving decisions, including taking responsibility for ensuring an adequate charge or access to charging stations.
Arguably the most exciting developments are in the car software rather than hardware. Like smartphones, manufacturers are treating their vehicles as platforms for future apps. New cars are including technology that is beyond the capabilities of current software such as that needed to run an autonomous vehicle.
Early beneficiaries are the insurance industry (telematics), car servicing providers (who can more actively target customers) and a whole range of innovators who are able to invent solutions that as drivers we never even knew we needed.
Perhaps a tie-up between innovators like Tesla and Apple will show what connected cars and infrastructure could do!
Unlike previous waves of computing, the internet of things is going to take many years to reach its full potential. During this transition business opportunities that are unimaginable today are going to emerge. Many of these lie in the data flowing across the internet of things that is genuinely big (see It’s time for a new definition of big data).
These new solutions can be thought of as permutations of cloud services (see Cloud computing should be about new business models). Those wanting to try to imagine the world of possibilities should remember that the last twenty years have taught us that the consumer will lead the way and that incumbents are seldom the major innovators.
However, incumbent energy and infrastructure companies that do want to take the lead could do well by partnering with consumer technology companies. The solutions they come up with are likely to include completely different relationships between a range of providers, services and customers.
Posted on 02/22/2014 at 04:43 PM in Information Overload, Science, Web/Tech | Permalink | Comments (0) | TrackBack (0)