
Remember the first time you used the Internet?

Today, we connect online far more than in person. And we generate reams of personal data as we go, via email, blogs, Twitter.

Add to that a billion mobile phones and new sensors connecting cars, refrigerators, TVs, even houses. The data we generate via the Internet now rivals the volume of all the words ever spoken.

How we manage that data efficiently via cloud computing was the focus of the National Institute of Standards and Technology (NIST) Cloud and Big Data Forum, where technologists gathered to explore how big data analytics is emerging as the “killer app” for the cloud.

Data is now driving scientific discovery, converging the physical and life sciences. Researchers no longer start with a hypothesis. They examine the data first, then hypothesize what it reveals.

But answering science’s largest questions will require interconnected clouds that operate on data simultaneously or move partial results back and forth.

And in the future, 80% of cloud workloads will come from other clouds, so interoperability standards will be key.

Interconnected clouds might be needed to predict global disasters and manage response, simulate astrophysics models or deliver specialized medical care remotely.

According to keynote speaker Vint Cerf, a founding father of the Internet and Google’s Chief Internet Evangelist, interconnected clouds are in the “same state of infancy as the Internet was in the 1970s.”

Interconnected clouds are an aspirational goal in NIST’s definition of the hybrid cloud. As one of the earliest Internet users, NIST has been researching cybersecurity, identity management, and other topics that enable the hybrid cloud.

The hybrid cloud includes “two or more distinct cloud infrastructures bound together by standardized or proprietary technology to enable data and application portability (e.g., cloud bursting for load balancing between clouds).”
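The cloud bursting mentioned in that definition can be sketched in a few lines: work goes to a private cloud until it runs out of capacity, and overflow “bursts” to a public cloud. This is a minimal, hypothetical illustration; the class and method names are invented for the example, not a real cloud API.

```python
class Cloud:
    """Toy model of a cloud with a fixed job capacity (illustrative only)."""

    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.load = 0

    def has_room(self):
        return self.load < self.capacity

    def submit(self, job):
        self.load += 1
        return f"{job} -> {self.name}"


def burst_submit(job, private, public):
    """Prefer the private cloud; burst to the public cloud when it is full."""
    target = private if private.has_room() else public
    return target.submit(job)


private = Cloud("private", capacity=2)
public = Cloud("public", capacity=10)
placements = [burst_submit(f"job{i}", private, public) for i in range(4)]
print(placements)
```

Here the first two jobs land on the private cloud and the remaining two burst to the public cloud, which is exactly the load-balancing portability the NIST definition describes.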

In talking with Tim Grance, NIST co-author of the definition, and NIST security expert Lee Badger, I learned that what began as an effort with co-author Peter Mell to understand cloud computing for themselves became a universal description of what cloud is – and what to do with it.

As NIST consulted with government and industry, the definition expanded. Its modularity in matching services to deployment models is meant to spark new business models that can be implemented quickly.

NIST’s ultimate vision for the cloud is to take a workload and move it anywhere within one’s cost and security requirements. This would drive powerful efficiency in today’s IT infrastructure and foster healthy market competition. As Tim noted, market forces will further interpret the definition over time.
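That vision – move a workload anywhere within one’s cost and security requirements – amounts to constrained placement. A minimal sketch, with entirely hypothetical cloud names, prices, and security levels, might filter eligible clouds and pick the cheapest:

```python
# Hypothetical catalog of interconnected clouds; all values are illustrative.
clouds = [
    {"name": "public-a",  "cost_per_hour": 0.10, "security_level": 1},
    {"name": "hybrid-b",  "cost_per_hour": 0.25, "security_level": 2},
    {"name": "private-c", "cost_per_hour": 0.60, "security_level": 3},
]


def place(workload, clouds):
    """Return the cheapest cloud meeting the workload's security floor
    and cost ceiling, or None if no cloud qualifies."""
    eligible = [
        c for c in clouds
        if c["security_level"] >= workload["min_security"]
        and c["cost_per_hour"] <= workload["max_cost"]
    ]
    if not eligible:
        return None
    return min(eligible, key=lambda c: c["cost_per_hour"])["name"]


print(place({"min_security": 2, "max_cost": 0.50}, clouds))
```

With these toy numbers, a workload needing security level 2 under $0.50/hour lands on “hybrid-b”; tighten the security floor to 3 and nothing qualifies – the market-competition effect Tim describes is clouds competing to appear in that eligible list.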

While industry gravitated first towards public cloud services, government gravitated towards private cloud services. Now each is exploring the other XaaS/deployment combinations as well.

Today, NIST’s flexible definition of cloud is embraced all over the world. Commercial and public sectors alike use it to assess which cloud solutions might best enhance operations or drive business growth.

The value of NIST’s common definition lies in how it will lead to metrics and standards that harmonize global cloud connectivity. As Vint Cerf summarized in his keynote, if we “push the boundaries of cloud computing … the opportunities are enormous … and there for all of us.”

Follow @JacquelnVanacek for how cloud can reinvent government and the economy.