With the explosive growth of the cloud, it’s imperative that IT leaders within state and local government and education (SLED) organizations, and the municipalities they serve, find success on their cloud journey. The public cloud offers dynamic, scalable services, as well as the potential to support remote workers and reduce costs. That said, it also has hidden pitfalls that may go unnoticed until budgets and time have already been expended.

Hyper-scale infrastructure can solve many SLED-specific customer needs, but doing so requires using cloud-native APIs (application programming interfaces) and architecting for application-level resiliency.  

But what happens when moving existing applications? Migrating applications to the cloud can prove challenging: many of the applications SLED organizations rely on today would require costly redevelopment to use cloud-native APIs.

These existing applications represent the next wave of cloud adoption for SLED organizations. There is an easier path to the cloud today that avoids significant redevelopment: taking advantage of cloud-native services such as artificial intelligence, machine learning, Internet of Things (IoT), and blockchain to “wrap” migrated apps (or those stranded on-prem) with new functionality.

File has been the de facto data format for applications for more than thirty years. File-based storage has APIs, too; we just don’t think of it that way. Workflows and services use files and their APIs every day: NFS for Unix/Linux and CIFS/SMB for Windows/macOS are two examples. File APIs offer a rich set of metadata, security, and compatibility features and, most importantly, they do this natively from any platform, because network file sharing was built into the core of these operating systems years ago.
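The point about rich, platform-native metadata can be illustrated with a short sketch. The POSIX file API behaves identically whether a path sits on local disk or on an NFS/SMB mount; the temporary directory below merely stands in for a network share, which is an assumption made for this example.

```python
import os
import stat
import tempfile

# The file API works the same whether 'share' is local disk or an
# NFS/SMB mount point -- a temporary directory stands in for a share here.
with tempfile.TemporaryDirectory() as share:
    path = os.path.join(share, "report.csv")
    with open(path, "w") as f:
        f.write("id,value\n1,42\n")

    # Metadata comes with the API: permissions, size, timestamps.
    info = os.stat(path)
    mode = stat.filemode(info.st_mode)   # e.g. '-rw-r--r--' (umask-dependent)
    size = info.st_size                  # length in bytes
    mtime = info.st_mtime                # last-modified timestamp
    print(mode, size)
```

No special client library is needed: the operating system's own file-sharing stack presents the remote data through the same `open`/`stat` calls an application already uses.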

What SLED IT leaders need is a better way: one that lets organizations harness the power of the cloud provider’s object and block services while using file at scale for their data needs across hybrid environments. Such a tool would give legacy applications the full resources of the cloud without the cost of rewriting them, and without the risk of failure should a rewrite not be possible.

Public cloud providers offer basic support for file services but lack the features and performance many SLED organizations require in their existing systems today. The initial design of many public clouds let applications burst into the cloud through a third-party tool: a working set of data could be copied into a managed file service such as Amazon EFS, the workload could then run in the public cloud against that data using the existing file API (NFS, in this case), and the results could be pulled back out of the public cloud for analysis.
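The three-step burst pattern described above can be sketched as follows. This is a minimal illustration, assuming the cloud file service is already NFS-mounted; the directory names (`onprem_data`, `mnt_efs`) and the trivial "processing" step are placeholders for a real mount point and a real workload.

```python
import shutil
from pathlib import Path

ON_PREM = Path("./onprem_data")     # stand-in for on-prem storage
CLOUD_MOUNT = Path("./mnt_efs")     # stand-in for an NFS-mounted share, e.g. /mnt/efs

def burst(working_set: list) -> None:
    # 1. Copy the working set into the cloud-backed file share.
    staged = CLOUD_MOUNT / "working_set"
    staged.mkdir(parents=True, exist_ok=True)
    for name in working_set:
        shutil.copy2(ON_PREM / name, staged / name)

    # 2. Cloud compute processes the files through the same file API
    #    (represented here by a trivial transformation).
    results = CLOUD_MOUNT / "results"
    results.mkdir(exist_ok=True)
    for src in staged.iterdir():
        (results / src.name).write_text(src.read_text().upper())

    # 3. Pull the results back on-prem for analysis.
    out = ON_PREM / "results"
    out.mkdir(parents=True, exist_ok=True)
    for src in results.iterdir():
        shutil.copy2(src, out / src.name)

ON_PREM.mkdir(exist_ok=True)
(ON_PREM / "sample.txt").write_text("raw data")
burst(["sample.txt"])
print((ON_PREM / "results" / "sample.txt").read_text())  # prints "RAW DATA"
```

Because steps 1 and 3 are explicit copies, the application itself never changes; the orchestration around it is where the operational burden, discussed next, actually lives.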

The sophistication required to orchestrate data into the public cloud remains a barrier for many organizations. And this approach still does not give application teams their ultimate goal: leveraging native cloud tools against their legacy data. For research universities, where compute resources are critical for processing data, GCP and AWS have built data centers with low-latency private links that connect public cloud utility compute directly to the research facilities’ existing data management systems, eliminating data orchestration.

There is a need for persistent, petabyte-scale, file-based solutions in the public cloud that bridge the needs of existing customer applications and workflows while offering the agility and scalability of public cloud services.

Qumulo provides software for file and data services across public and private clouds at petabyte scale, with data services that meet the demands of state and local government, municipalities, education, and transportation. This means SLED organizations can harness the power, efficiency, and flexibility of the public cloud’s infrastructure- and platform-as-a-service models through Qumulo’s API and leverage their data no matter where it resides.

This also makes it easier for these organizations to leverage the cloud, because they no longer have to re-architect their apps when deciding to move them to the public cloud. If a user chooses to leave an app on-prem but move the data to the public cloud (or vice versa), Qumulo’s hybrid file software can connect the data seamlessly, along with any cloud provider services the data might be using, such as AI or IoT platforms. Qumulo provides users with a robust data analytics engine and dashboard for full visibility into what is happening with their file data in real time. That data can be pulled into other ISV-provided data services, such as Splunk or Databricks, and Qumulo can also be integrated with services such as ServiceNow for easier adoption in DevOps deployment workflows.

File data of all types and sizes can now be moved easily to the public cloud, or to a mixed private and public cloud environment. With Qumulo, state and local governments and education organizations are no longer tethered to hardware limitations. They can scale their infrastructure up or down instantly in the public cloud and build and deliver the apps and services that matter most to their end users and constituents.

For more information, please contact Donald Schiltz: dschiltz@qumulo.com