Strategic advice to leverage new technologies
Technology is at the heart of nearly every enterprise, enabling new business models and strategies and serving as a catalyst for industry convergence. Leveraging the right technology can improve business outcomes, providing intelligence and insights that help you make more informed and accurate decisions. From finding patterns through data science, to curating relevant insights with data analytics, to the predictive abilities and innumerable applications of AI, to solving challenging business problems with ML, NLP, and knowledge graphs, technology has raised decision-making to a more intelligent level. Keep pace with the technology trends, opportunities, applications, and real-world use cases that will move your organization closer to its transformation and business goals.
Recently Published
This article discusses intelligent apps, which offer good examples of how cognitive computing and other advanced analytics technologies can help organizations redefine customer engagement and assist both consumers and employees in making complex decisions.
Analytics by the Footprint
As big data and analytics move into more diverse applications, across different usage types and industries, it becomes increasingly important to categorize the analytic processes themselves. The progress of digital business means that processes applied in one realm are easily transferred to another, provided that the operational similarities can be uncovered. This demands a more inclusive taxonomy, as we explore in this Executive Update.
In this article, the authors explain why we need a deeper understanding of technology's impact and why we must manage that impact judiciously, steering our technology consumption in a way that not only realizes the anticipated value of evolving technologies but also ensures balance and sustainability for our planet.
An alien reading today's IT industry and business reporting would be hard-pressed to define IT as anything other than the latest "app" and the "cloud," lurching from one security breach to the next. Most Cutter IT Journal readers should know better. The realm of IT is vast and expanding, as more and more of our analog world is reduced to bits rather than atoms. One way to make sense of this vastness is to use a layered abstraction, which I call the uber-stack.
Five Enablers of Web Ubiquity
The first 25 years of the Web clearly demonstrated that connectivity and problem solving can be cost-effectively linked. It's now possible to communicate, shop, and learn on the Web. We can find answers, relationships, and games on the Web, and for those of us who desire a more surrealistic experience, we can immerse ourselves in virtual worlds.
Over the last decade, the growing adoption of personal devices has led many people to have their first and only computing experience on a phone or tablet. Over the coming decade, these devices will drop in price and proliferate, with a large percentage of the earth's population owning more than one personal computing device (phone, tablet, watch, glasses, etc.). Unfortunately, the very devices designed to help us communicate and work better together are creating challenges for in-person engagement. Face-to-face contact has already been altered by the constant distractions of our personal devices, breaking the social contract we observe when engaged in conversation with someone.
Drones are all the talk these days. They are useful in climate science, space research, energy and environmental research, and surveillance of sensitive areas, and they show great potential for growth in technology and applications. Drone technology draws on many branches of engineering, including robotics, computer technology, avionics, air and space research, mechanics, fluid mechanics, thermodynamics, and heat transfer.1 As energy prices rise and autonomy requirements increase, managing the power demand of onboard computing devices becomes a major concern.
In this article, I argue that over the next decade or so, we will see the emergence of a multidisciplinary technological convergence toward high-level manufacturing and information processing based on nanotechnologies.