Your applications are unique, so why shouldn't you have a choice in where your various services run? In the Cognitive Business Era, you'll have a choice of platforms, runtimes, and deployment models.
The technology leaders of tomorrow will have to architect for the dynamic needs of economics, data access, workloads, machine learning, compliance, and performance. That means building with a purposeful, strategic architecture across all of these elements.
They'll have to focus on innovation, because tomorrow's technology will be different from today's: more data focused and more mobile intensive. Getting there will require purposeful architectures that can adapt to inevitable change.
One-Size-Fits-All: Not a Fit for the Cognitive Business Era
Building with purposeful architectures requires a clear infrastructure strategy for both the known and the unknown. Building for the unknown may sound tricky, but it’s a requirement in an age where technology is constantly changing.
Future innovations are out there, and if your architecture isn't built to keep evolving, you risk falling behind. Today's connected economy requires both open technology [hyperlink to previous blog with anchor text “open technology”] and the freedom to optimize where workloads run in order to accelerate technology breakthroughs.
IBM: Building with Collaborative Innovation
In the last blog we talked about how open technology would be a driver in the Cognitive Business Era. But what good is open technology if we use it in closed ways? IBM is working to make sure you have the choice to run your services where they can be best optimized.
If that means combining the best of IBM's open technologies with the best of another vendor's, then so be it. IBM took just such an approach by working closely with NVIDIA on a project for the U.S. Department of Energy.
Choice for Optimization – IBM Success Story
U.S. Department of Energy
The U.S. Department of Energy knows big data environments. It knows that they're different, and that they require a new kind of architecture, one that embeds compute power throughout the system. This is at the core of IBM's move toward data-centric design.
The current approach, in which data moves repeatedly back and forth between storage and processor, becomes unsustainable because of the significant time and energy those massive, frequent data moves consume. Data-centric design allows for speed, and speed allows for the convergence of analytics, modeling, visualization, and simulation. The old approach wasn't going to work for the DOE's needs. It needed a system optimized for its unique requirements.
IBM and NVIDIA to the Rescue
NVLink, NVIDIA's high-speed interconnect, will enable CPUs and GPUs to exchange data five to 12 times faster than they can today.
This optimized solution will help the Department of Energy's developers accelerate applications more easily, by incrementally moving parts of the application and its data to GPU accelerators. Check out the press release for more details on how IBM and the DOE are working together to optimize for big data.
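To make "incremental offload" concrete, here is a minimal sketch of the pattern, not code from the IBM/DOE project. It assumes CuPy as an illustrative GPU array library and falls back to NumPy when no GPU is available, so only the hot loop changes while the rest of the application stays put:

```python
# Illustrative sketch: incrementally offloading one hot spot to a GPU.
# CuPy (a GPU array library with a NumPy-like API) is an assumption here;
# without it, the same code runs unchanged on the CPU via NumPy.
import numpy as np

try:
    import cupy as xp  # GPU path: arrays live in device memory
    ON_GPU = True
except ImportError:
    xp = np            # CPU fallback: identical array API
    ON_GPU = False

def simulate_step(grid):
    """One relaxation step of a toy stencil computation.

    Only this hot loop is moved to the accelerator; the rest of the
    application (setup, I/O, analysis) stays on the CPU unchanged.
    """
    return 0.25 * (xp.roll(grid, 1, axis=0) + xp.roll(grid, -1, axis=0)
                   + xp.roll(grid, 1, axis=1) + xp.roll(grid, -1, axis=1))

# Move the data to the device once, iterate there, copy back once:
# minimizing transfers is the whole point of a fast interconnect.
grid = xp.asarray(np.random.rand(256, 256))
for _ in range(100):
    grid = simulate_step(grid)
result = xp.asnumpy(grid) if ON_GPU else grid
```

The design point matches the blog's argument: data movement is the bottleneck, so you copy to the accelerator once, keep the computation resident there, and bring results back only at the end.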
Are You Ready to Build with Collaborative Innovation?
Let's be honest: the Department of Energy's supercomputing systems at Lawrence Livermore and Oak Ridge National Laboratories are probably bigger than what you're working with. But the goal of the Cognitive Business Era is to bring technology like this to organizations large and small.
This is a bold approach, one where we’re asking you to reevaluate the way you do business. We’re asking you to put technology at the forefront and make it a driver of business innovation.
Start a conversation with one of our cognitive business experts to begin evaluating your organization's cognitive business journey.