The following is an interview with Dr. Greg Gardner, Chief Architect, Government and Defense Solutions at NetApp. To learn more about how your agency can excel with big data, be sure to check out our guide: The Big Data Playbook for Government.
With government agencies creating more data than ever before, many organizations are leveraging emerging technology to create new kinds of data infrastructures. GovLoop sat down with Dr. Greg Gardner, Chief Architect, Government and Defense Solutions at NetApp, to discuss new data platforms for cloud, mobile and big data. These solutions can enable the public sector to make sense of the huge amounts of information being created today.
“We are seeing a combination of on-premise and cloud computing in an increasing number of organizations to address their data needs,” said Dr. Gardner. “This combination of on-premise and cloud computing is what we call a hybrid IT infrastructure – where some IT is in the cloud and data is stored primarily on-premise.”
NetApp’s distribution partner, Arrow ECS, enables agencies to manage their hybrid cloud environment from a single pane of glass with a robust portfolio of cloud services and products and a management platform called ArrowSphere.
“An agency’s IT infrastructure has to be replicated and made available to cloud computing capabilities to create one common data infrastructure for compute,” said Dr. Gardner. “So regardless of whether you do compute on-premise or in the cloud as a service, people have access to the same data and information.”
The hybrid IT infrastructure creates a data-centric model, which is essential for the success of a big data program. The data-centric model replicates data so that, regardless of location, people are accessing the same data, whether it sits on-premise or in a cloud service. To create this kind of model, many organizations are looking to build a universal data platform that provides access to information regardless of where the data is hosted. Such a platform must include some common features to ensure consistent service quality across all cloud and data center assets.
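As a rough illustration of what that data-centric model might look like, the sketch below models a catalog that maps each dataset to replicas in both an on-premise system and a cloud service, so a request resolves to the same data no matter where the compute runs. All class names, URIs and fields are hypothetical, for illustration only; this is not a NetApp or Arrow product API.

```python
# Hypothetical sketch of a "universal data platform" catalog.
# Names are illustrative only; this is not any vendor's real API.
from dataclasses import dataclass, field


@dataclass
class Replica:
    location: str   # e.g. "on-premise" or "cloud"
    uri: str        # where this copy of the dataset lives
    version: int    # last replicated version


@dataclass
class Dataset:
    name: str
    replicas: list[Replica] = field(default_factory=list)

    def resolve(self, preferred_location: str) -> Replica:
        """Return the replica closest to the caller's compute environment.

        Because replication keeps every copy at the same version, the caller
        sees identical data whether it runs on-premise or in the cloud.
        """
        for replica in self.replicas:
            if replica.location == preferred_location:
                return replica
        return self.replicas[0]  # fall back to any available copy


# Example: the same dataset registered on-premise and in a cloud service.
permits = Dataset("building-permits")
permits.replicas.append(Replica("on-premise", "nfs://datacenter/permits", version=42))
permits.replicas.append(Replica("cloud", "s3://agency-bucket/permits", version=42))

print(permits.resolve("cloud").uri)        # compute running as a cloud service
print(permits.resolve("on-premise").uri)   # compute running in the data center
```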
These features usually include secure multitenancy: the ability to host workloads and data belonging to multiple organizations or functions, all at the same level of classification, without degrading service or blurring data boundaries.
Another element is pooled virtual resources: the ability to abstract hardware resources, irrespective of their location or type, and to provide efficient data transport and management.
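To make those two features a little more concrete, here is a minimal, purely illustrative sketch (hypothetical names, not any product's interface) of a shared storage pool that keeps each tenant's data in its own namespace and enforces per-tenant quotas, so one workload cannot crowd out or read another's data.

```python
# Illustrative only: a pooled storage abstraction with per-tenant isolation.
# Class and method names are hypothetical, not a real product interface.
class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocated_gb = 0
        self.namespaces: dict[str, dict[str, bytes]] = {}  # tenant -> objects
        self.quotas: dict[str, int] = {}                   # tenant -> GB quota
        self.usage: dict[str, int] = {}

    def add_tenant(self, tenant: str, quota_gb: int) -> None:
        """Give a tenant its own namespace and a hard quota within the pool."""
        if self.allocated_gb + quota_gb > self.capacity_gb:
            raise ValueError("pool capacity exhausted")
        self.allocated_gb += quota_gb
        self.namespaces[tenant] = {}
        self.quotas[tenant] = quota_gb
        self.usage[tenant] = 0

    def write(self, tenant: str, key: str, data: bytes, size_gb: int) -> None:
        """Writes land only in the caller's namespace and count against its quota."""
        if self.usage[tenant] + size_gb > self.quotas[tenant]:
            raise ValueError(f"{tenant} exceeded its quota")
        self.namespaces[tenant][key] = data
        self.usage[tenant] += size_gb

    def read(self, tenant: str, key: str) -> bytes:
        """A tenant can only ever see keys inside its own namespace."""
        return self.namespaces[tenant][key]


pool = StoragePool(capacity_gb=1000)
pool.add_tenant("finance", quota_gb=400)
pool.add_tenant("public-works", quota_gb=400)
pool.write("finance", "budget.csv", b"...", size_gb=1)
# pool.read("public-works", "budget.csv") would raise KeyError: data stays isolated.
```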
For government agencies, it is essential that data remain immediately available as it is moved and stored, whether an employee is working in a cloud infrastructure or an on-premise one.
“If you just have all the data in one place, you are limited as to how you can use that data,” said Dr. Gardner. “We talk about data like it is roaming charges on a cell phone plan; you can store it very cheaply for pennies per gigabyte per month, but if you go to move the data, it gets very expensive. A universal data infrastructure makes data available regardless of when, where, or how people are using it. We believe that is really key.”
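Dr. Gardner's roaming-charge analogy is easy to see with rough, back-of-the-envelope numbers. The per-gigabyte prices below are assumptions for illustration only, not quoted vendor rates.

```python
# Back-of-the-envelope comparison of storing data versus moving it.
# The per-GB prices are illustrative assumptions, not actual vendor rates.
dataset_gb = 10_000                 # a 10 TB dataset
storage_price_per_gb_month = 0.02   # "pennies per gigabyte per month" (assumed)
egress_price_per_gb = 0.09          # typical order of magnitude for egress (assumed)

monthly_storage_cost = dataset_gb * storage_price_per_gb_month
one_time_move_cost = dataset_gb * egress_price_per_gb

print(f"Store 10 TB for a month: ${monthly_storage_cost:,.2f}")   # ~$200
print(f"Move that 10 TB out once: ${one_time_move_cost:,.2f}")    # ~$900
```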
In addition to this hybrid model, agencies are also exploring the benefits of a converged infrastructure, in which compute, routing, switching, storage, and virtualization are provided together in one package, ideally supported by a single agent.
For instance, a converged infrastructure will include varying amounts of compute or storage, whatever is required to meet the mission at hand. These are preconfigured packages with one point of contact for help desk services. Having compute, routing, switching, storage and virtualization in one package makes the infrastructure easier to manage, controls costs, and enables scaling up or out to meet user demands.
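One way to picture such a preconfigured package is the sketch below. The field names and sizes are hypothetical, not any vendor's bill of materials; the point is only that compute, storage, switching and virtualization arrive sized together and can then be grown up or out as demand changes.

```python
# Hypothetical model of a preconfigured converged-infrastructure package.
# Field names and sizing are illustrative, not a real product specification.
from dataclasses import dataclass


@dataclass
class ConvergedPackage:
    compute_nodes: int
    storage_tb: int
    switch_ports: int
    hypervisor: str = "supported hypervisor"
    support_contact: str = "single help desk"   # one point of contact

    def scale_up(self, extra_storage_tb: int) -> None:
        """Grow an existing package in place to meet demand."""
        self.storage_tb += extra_storage_tb

    def scale_out(self, extra_nodes: int) -> None:
        """Add more nodes when a single package is no longer enough."""
        self.compute_nodes += extra_nodes


# A package sized for the mission at hand, then grown as demand changes.
stack = ConvergedPackage(compute_nodes=4, storage_tb=100, switch_ports=48)
stack.scale_out(extra_nodes=2)
stack.scale_up(extra_storage_tb=50)
```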
Dr. Gardner offered a hypothetical use case that shows the power of a converged infrastructure combined with a hybrid IT model. Take, for example, a county government with an aging data center that needs to upgrade its infrastructure. The county has legacy IT systems, a limited workforce, budget limitations and regulatory requirements. Those factors combine to constrain the way it does business and create challenges to innovation in service delivery. To work around these obstacles, the county could maintain control of its data on-premise, replicate the data to a co-location facility and then, using high-bandwidth pipes, expose it to the cloud. In doing so, the county could leverage low-cost, competitive compute capabilities from a number of vendors while maintaining control of its data. The cost savings and significantly increased capabilities of this approach offer real advantages to the county.
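A rough sketch of the data flow in that county scenario might look like the following. Every function and path here is a hypothetical placeholder for whatever replication and connectivity tooling the county actually uses; it only shows the shape of the pipeline: the authoritative copy stays on-premise, a replica lands in the co-location facility, and cloud compute from multiple vendors reads that replica over the high-bandwidth link.

```python
# Shape of the county's hybrid data flow, with placeholder functions only.
# Nothing here is a real replication API; it sketches the sequence of steps.

def replicate(source: str, target: str) -> str:
    """Copy the on-premise dataset to the co-location facility (placeholder)."""
    print(f"replicating {source} -> {target}")
    return target

def expose_to_cloud(replica: str, providers: list[str]) -> dict[str, str]:
    """Make the co-located replica reachable from each cloud provider's compute
    over the high-bandwidth link, without moving the authoritative copy."""
    return {p: f"{replica} via dedicated link to {p}" for p in providers}


authoritative_copy = "onprem://county-datacenter/records"   # county keeps control
colo_replica = replicate(authoritative_copy, "colo://facility/records")
endpoints = expose_to_cloud(colo_replica, ["provider-a", "provider-b"])

for provider, path in endpoints.items():
    print(f"{provider} compute reads from {path}")
```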
“There are certainly hurdles in security, data access and management, and fiscal constraints in these approaches,” added Dr. Gardner. “But we believe the challenges can be overcome with a hybrid IT approach that embraces cloud computing as well as on-premise data management. We strongly believe in the idea of a converged infrastructure and a universal data platform. This model can be realized with the appropriate architecture and by working with partners like Arrow to bring public and private IT organizations that kind of solution.”
For more information, be sure to check out our guide: The Big Data Playbook for Government.