Strategic advice to leverage new technologies
Technology is at the heart of nearly every enterprise, enabling new business models and strategies and serving as a catalyst for industry convergence. Leveraging the right technology can improve business outcomes, providing the intelligence and insights that help you make better-informed, more accurate decisions. From finding patterns in data through data science, to curating relevant insights with data analytics, to the predictive abilities and innumerable applications of AI, to solving challenging business problems with ML, NLP, and knowledge graphs, technology has raised decision-making to a more intelligent level. Keep pace with the technology trends, opportunities, applications, and real-world use cases that will move your organization closer to its transformation and business goals.
Recently Published
One of the most prevalent blockchains in the world, Ethereum, is poised to switch from a proof-of-work (PoW) algorithm to a proof-of-stake (PoS) algorithm, likely in 2017, with the release of the Casper codebase. Why does this matter? Because blockchain technology is becoming increasingly relevant and prevalent in businesses across the globe. It holds great potential to disrupt how businesses perform basic transactions, from payments, to programmable, self-executing contracts, to identity verification.
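For readers new to the distinction, here is a minimal, illustrative Python sketch (not Ethereum's actual Casper code) contrasting the two consensus styles: PoW spends computation searching for a nonce whose hash meets a difficulty target, while PoS selects the next block proposer with probability proportional to the funds each validator has staked.

```python
import hashlib
import random

def proof_of_work(block_data, difficulty=4):
    """Brute-force a nonce until the block hash starts with `difficulty` zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # the first miner to find a valid nonce wins the block
        nonce += 1

def proof_of_stake(validators):
    """Select the next block proposer, weighted by each validator's stake."""
    names = list(validators)
    stakes = [validators[name] for name in names]
    return random.choices(names, weights=stakes, k=1)[0]

print("PoW nonce found:", proof_of_work("block #1"))
print("PoS proposer:", proof_of_stake({"alice": 32.0, "bob": 16.0, "carol": 8.0}))
```

The practical difference the sketch highlights is what secures the chain: in PoW it is hardware and electricity spent on hashing; in PoS it is the capital a validator puts at risk.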
Cognitive Computing 2017
Let’s take a look at a technology poised for a breakout year: cognitive computing. There is a considerable amount of innovation in the development and application of cognitive computing across almost every industry. Consequently, companies should start examining how they can benefit from applying cognitive computing, focusing on five key domains: (1) research and discovery; (2) decision support and advisory; (3) customer engagement/customer experience management; (4) Internet of Things (IoT); and (5) cybersecurity.
Process of Things (PoT) is a new way of tapping into the worlds of IoT and business process. PoT ensures that things do not function as silos but instead contribute collectively to value-added enterprise services, such as building smart applications around connected things and reaching more customers through adaptable things.
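As a concrete (and purely hypothetical) illustration, the Python sketch below captures the PoT idea in miniature: individual things publish events, and a business process acts on their collective signal rather than on any one device in isolation. All names here are invented for the example.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class ThingEvent:
    """A reading emitted by one connected 'thing' (e.g., a smart shelf sensor)."""
    device_id: str
    reading: float

@dataclass
class RestockProcess:
    """A toy business process fed by many things rather than a single silo."""
    threshold: float
    events: list = field(default_factory=list)

    def on_event(self, event):
        self.events.append(event)

    def decide(self):
        # The process acts on the collective signal across all connected things.
        average = mean(e.reading for e in self.events)
        return "reorder stock" if average < self.threshold else "no action"

process = RestockProcess(threshold=20.0)
for event in [ThingEvent("shelf-1", 12.0),
              ThingEvent("shelf-2", 31.0),
              ThingEvent("shelf-3", 14.0)]:
    process.on_event(event)
print(process.decide())  # -> "reorder stock"
```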
IT project performance used to be characterized by runaway projects, late delivery, exceeded budgets, reduced functionality, and questionable quality, shortfalls that translated into cancellations, reduced scope, and significant rework cycles. To combat this less-than-favorable track record of project delivery, the UK’s Association for Project Management has set out a rather challenging vision: a world in which, by the year 2020, all projects succeed. On my wish list for 2017, though, is the more modest aspiration of a new kind of better-informed, more successful project management.
The Future of Collaboration
Today’s work environment is changing rapidly, so much so that the ever-increasing velocity of innovation and technology will create a workplace five years from now that looks nothing like today’s. The big three factors in this coming change are: the “gig” economy, or project work, becoming the norm; the move from networks to ecosystems; and the augmentation of human work with AI systems.
The IT Scene in 2017
Last year, I declined to make specific predictions about specific technologies, what they’d disrupt, and their impact on society — a decision based on a mix of humility and cowardice. Instead, I made three predictions about challenges in the environment in which IT exists. These challenges have only become more real, making the predictions at least directionally accurate. In this article, I present updates to those predictions.
There has scarcely been a day in the past few years without a breathless press release or article touting some new technological advance and how it will “literally change the world as we know it — forever.” We’ve seen, in quick succession, big data, 3D printing, and the Internet of Things (IoT), to mention just a few, handed the mantle of the next big societal change agent.
Every year, pundits, hardware vendors, software vendors, consultancies, academics, and even government agencies publish their technology “picks” for the following year. One cannot help but notice that just about all of the lists identify the usual suspects: cloud computing, AI, predictive analytics, wearables, and augmented reality, along with roughly five others, make nearly everyone’s “Top 10.”
The approach I take here is a little different: I look at clusters of technologies, because of the overlap among the usual suspects and the interrelationships among them. Identifying clusters is also useful because it acknowledges the interdependencies among technologies, as well as the integration and interoperability required to optimize whole technology clusters and meaningfully impact business rules, processes, and models. In other words, it is operationally impossible to acquire, pilot, deploy, and support technologies individually, because they are always dependent upon one another and always connected.
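One simple way to make “clusters” concrete is to model technologies as nodes in a dependency graph and read the clusters off as connected components. The pairings below are hypothetical, chosen only to illustrate the idea:

```python
from collections import defaultdict

# Hypothetical dependency pairs among the usual suspects (illustration only).
links = [
    ("IoT", "predictive analytics"),
    ("predictive analytics", "cloud computing"),
    ("AI", "cloud computing"),
    ("wearables", "IoT"),
    ("augmented reality", "wearables"),
    ("blockchain", "smart contracts"),
]

graph = defaultdict(set)
for a, b in links:
    graph[a].add(b)
    graph[b].add(a)

def clusters(graph):
    """Group technologies into clusters (connected components) via DFS."""
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            current = stack.pop()
            if current not in group:
                group.add(current)
                stack.extend(graph[current] - group)
        seen |= group
        groups.append(sorted(group))
    return groups

for cluster in clusters(graph):
    print(cluster)
```

Even this toy version makes the point: pilot one node of a cluster in isolation and you have implicitly committed to evaluating its neighbors as well.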