NetApp Eyes Opportunities in Health Care Data Storage
By Kevin Davies
February 1, 2013 | Whatever happened to NetApp? When Bio-IT World launched in 2002, NetApp was one of the big names in big data storage in the biotech and life sciences arena. But over the past decade, while brand names such as Isilon, EMC, BlueArc, Quantum, Panasas, DDN and many others have cashed in on the data deluge, NetApp kept at best a very low profile in the space. That is not to say that it was not in use or that the technology does not have its supporters: on the contrary, many data center managers could point to trusted NetApp installations. NetApp storage is used at Genentech and several other major biotech firms headquartered in California and beyond. For some, however, it was simply less painful to keep integrating their old NetApp systems than to replace them with new ones.
But there are strong signs that NetApp is turning things around. For example, the company has introduced flash-based storage solutions (such as FlashCache and SSD-based architectures) to meet extreme performance requirements. These technologies have also been integrated with NetApp’s Virtual Storage Tiering solution to help customers leverage flash for better performance while still utilizing cost-effective storage.
Bio-IT World reached out to Dave Nesvisky, who joined NetApp in September 2010 as senior director of health care sales, for an update on NetApp’s new technology and rekindled interest in the health care and life sciences sector.
Bio-IT World: Dave, what’s been going on at NetApp over the past few years as the life sciences have been swamped with data?
Dave Nesvisky: There's been significant change over the past couple of years and it continues to evolve. Health care's obviously a very broad area and includes a lot of different segments—you’ve got providers, research, regulatory, device manufacturers, health insurance, and distributors. There's almost no limit to what you could include in the health care segment.
When I joined NetApp a couple of years ago, NetApp had several thousand customers in health care. Customers were using our products for the kinds of things that every NetApp customer uses our products for: Exchange and SharePoint and virtualized data centers, general IT, not anything specific to health care. But many of those clients, especially hospitals, clinics, and providers, were very interested in solving bigger problems. They were enjoying the benefits that NetApp brings in terms of storage efficiency, total cost of ownership, and operational efficiency. They said, ‘You’re solving that problem for us at a small level because the data you’re managing represents a fraction of our overall data problem. Our bigger data storage expense is around diagnostic imaging and electronic medical records. Can you help us with that?’
A couple of years ago, NetApp was not fully prepared to help our customers in that market… We did not necessarily have the skill set around the applications that health care customers were running. My first step in joining the company was to start building a team—bringing in people who had come from big application companies that serve the provider market—companies like McKesson and Siemens—and a former health system CIO to help us better understand the market. We’re now in a much better position to support our customers around their bigger data problems.
Last year, we pulled together the payers and providers and a select number of software vendors and created the health care vertical that I lead today. That includes all stripes of providers—for-profit, not-for-profit, academic medical centers—all of that falls under our domain. Pharma and biotech are largely run out of a dedicated district that’s part of our Americas group, not part of the health care group today. As I said, different companies define health care differently. We’ve defined it around payers, providers, and some ISVs… It remains to be seen what’s going to make the most sense for NetApp, whether the existing structure is good, or whether it should have an expanded definition. But that’s our definition today.
What are the shifts in medicine driving this growth in data volume? And how is NetApp meeting that demand?
Nesvisky: One element is the basic research itself. They’re mapping more and more genomes, and that’s obviously driving much greater data requirements within that industry. But we’re seeing effects on the rest of health care… Today medicine is delivered reactively and episodically. You get sick. You go to the doctor. They treat you. That’s a very expensive way to treat people.
The push under the Affordable Care Act and ACOs (Accountable Care Organizations) is more in preventive medicine—the promotion of wellness rather than treating sickness. If you've got people with asthma or diabetes or high blood pressure, it's really about proactively getting these people into programs to maintain their wellness so that they don't get into the health care system any deeper than they're already in.
Where the future and the alignment with bio-IT lie is predictive medicine—the opportunity to look at somebody’s genetic makeup and predict with some level of accuracy: you have the markers that indicate that in 20 years you’re likely to get these conditions. What can we do now? And then, in line with the pharma companies that are starting to be able to create custom-made pharmaceuticals for individuals, to treat them more effectively and target their disease more accurately. That’s where the convergence is…
What is NetApp doing in the space? We acquired a company called Engenio from LSI a year or so ago to create a cost-effective and dense storage platform ideal for high-throughput workloads, for object content repositories, for unstructured data, and for other use cases where you’ve got either high volumes or very large databases or very large object containers.
Actually, that was a part of the portfolio that we didn’t previously have. We had a broad product portfolio that could essentially do that function, but this platform took it to the next level. It has very high throughput and very dense storage—obviously, when you talk about very large data sets, there are physical constraints on the data center before you have to expand it, so you want to pack as much storage into the smallest possible space. We’ve been very successful with that E-Series product. It’s a product that we sell into the space directly, and it’s also a very large OEM product for us.
What was it about that technology that particularly appealed to NetApp?
Nesvisky: The footprint density. It’s a very dense storage platform with very high throughput for use cases like full-motion video, which typical SAN or NAS systems were not built to handle effectively. It’s finding its way into a lot of different application areas. From the health care perspective, the two most interesting things are big data analytics and very large object content repositories in the multiple-petabyte range.
In terms of the actual data that you're supplying solutions for, what are you seeing?
Nesvisky: There may be a future application in telemedicine with video and image data, but that’s a little bit of a future state for us, not top-of-mind right now. Another emerging area is digital pathology. Today, the typical imaging modalities that you see—X-ray, CT, PET, MRI—as those modalities become more powerful and the images become more refined, they require more storage themselves. 3-D mammography was approved by the FDA last year, and it uses almost ten times more storage per image than 2-D. The typical modalities are taking up a tremendous amount of storage. In digital pathology, some of these things can run into a terabyte per study, which is an incredible amount of storage.
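Those figures translate into daunting archive math. Here is a back-of-the-envelope sketch in Python; the per-study sizes loosely follow the figures Nesvisky cites, while the annual study counts (and the 2-D exam size) are purely illustrative assumptions.

```python
# Back-of-the-envelope archive sizing for imaging modalities.
# Per-study sizes loosely follow the interview (digital pathology can reach
# ~1 TB per study; 3-D mammography is ~10x a 2-D exam). The annual study
# counts and the ~100 MB 2-D exam size are hypothetical assumptions.

modalities = {
    # name: (studies per year, size per study in TB)
    "2-D mammography":   (20_000, 0.0001),  # ~100 MB per exam (assumed)
    "3-D mammography":   (20_000, 0.001),   # ~10x the 2-D size
    "digital pathology": (1_000,  1.0),     # ~1 TB per study
}

total = 0.0
for name, (studies, size_tb) in modalities.items():
    annual = studies * size_tb
    total += annual
    print(f"{name:>18}: {annual:8.1f} TB/year")

print(f"{'total':>18}: {total:8.1f} TB/year")
```

Even with these modest assumed volumes, digital pathology alone dominates the archive by orders of magnitude, which is the point Nesvisky is making.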
But we also see, on the genomics side, data taking up a lot of space and requiring high bandwidth. We have clients who moved to NetApp because they’re getting a lot of efficiency out of a capability in our FAS product line called flexible volumes, or FlexVol. That allows a lot of researchers to each be allocated a large amount of storage, say several terabytes. The administrator is really only carving up a smaller amount, but it gives the appearance to the user that they have all they need.
In a typical environment without NetApp, you would have to physically provision that amount of storage to each user. If ten researchers each need 10 terabytes, you would physically have to provision 100 terabytes to those people, even though each of them might only be using one or two terabytes at any given time. With flexible volumes, you can tell them they have access to ten, but you know they’re not going to use all of it. You’re able to physically provision a lot less, which saves a lot on storage.
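The arithmetic behind that saving is simple enough to sketch. In the snippet below the quota and usage numbers mirror his example, while the 25% safety headroom is an illustrative assumption, not anything NetApp prescribes.

```python
# Thin vs. thick provisioning, using the numbers from the example above.
# This is an illustrative sketch, not NetApp's FlexVol implementation.

researchers = 10
quota_tb = 10         # each researcher is promised 10 TB
actual_use_tb = 2     # each typically consumes only ~2 TB
headroom = 1.25       # assumed 25% safety margin on actual use

thick = researchers * quota_tb                  # buy the full promise
thin = researchers * actual_use_tb * headroom   # buy for real usage + margin

print(f"thick provisioning: {thick} TB purchased")
print(f"thin provisioning:  {thin:.0f} TB purchased")
print(f"savings:            {thick - thin:.0f} TB ({(1 - thin / thick):.0%})")
```

With these numbers, thin provisioning purchases 25 TB instead of 100 TB, a 75% reduction, while every researcher still sees a 10 TB volume.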
The other thing people find with NetApp is that it’s just easier to manage. We consistently find that our administrators can manage much larger volumes of data and much larger arrays with far fewer people.
Are there a couple of installations in the last 12 months in your health care arena that you can point to as good examples?
Nesvisky: One that comes to mind is the Duke Institute for Genomic Sciences, which is a NetApp FAS customer. They were winning more and more grants and research projects, and it was stressing their systems because they had more and more researchers on them. The way they were adding people and trying to manage things led to runaway data growth, and they needed a new platform that was more efficient and could fit into their environment.
The two things they found with NetApp are that it works very well in a virtualized environment and that it delivers tremendous throughput. The way of doing it before was you’d get a grant and you’d stand up a new system, so you’ve got tons and tons of really underutilized servers and storage. And this is not unique to genomics… They made an architecture decision to move to NetApp in a heavily virtualized environment, and it gave them several tremendous advantages. It allowed them to reduce the footprint on the floor, which extended how long they could stay in their data center—if you can compress into a smaller footprint, your data center has more room to grow over time. That was really good. With fewer physical devices running, you can run it with a much more efficient staff… They were able to continue with the current staff and handle bigger workloads efficiently. And they were getting tremendous throughput from the system. Some really good benefits from making the move to NetApp.
What's your competitive advantage these days?
Nesvisky: There are a couple of areas. Clearly there are very successful top-tier players in the space, but the features of NetApp software, such as the flexible volume and the ability to virtually provision far more storage to users than is physically provisioned, proved very efficient for customers, as did the ease of management compared to other solutions.
Every other vendor tends to offer a portfolio of storage solutions: a particular solution for backup, another for production. And they have families of systems, so when you outgrow one of them you have to do a forklift upgrade to the next bigger series of equipment, which has a different operating system. So you’ve got to do a data migration: you’ve got to literally physically remove the system that was in there, put in the new system, migrate the data, retrain the staff, all that. And all of that has to be taken into account.
When people assess the long-term impact of their storage decision, NetApp runs one operating system. We have an ‘agile data infrastructure.’ This is important: our agile data infrastructure means that from a single operating environment, Data ONTAP, we can offer people non-disruptive operation. That means literally any system or software upgrade, adding equipment, retiring disks that are out of warranty or out of service, anything you need to do in the maintenance of an array, is done non-disruptively. You don’t have to schedule downtime, which is a huge advantage. Nobody wants to schedule downtime!
We have non-disruptive operation. We have intelligence, which means we have disks with different performance profiles, so we can put the data where it makes the most sense for the performance you need… In the agile data infrastructure you can build single volumes into the multiple tens of petabytes. If you have to store large volumes of genomic data, you’re really never going to run out of steam.
The agile data infrastructure is something unique that no other company can offer. They all offer a family of products that require you to literally retire one, forklift it out, and migrate the data. It’s an expensive, complex, time-consuming process. We eliminate that. And when people recognize what NetApp is doing, they see it’s absolutely revolutionary. You can literally build a storage architecture with us now, with a single operating environment, that starts from the smallest array, whatever very small system you want to begin with, and scales essentially without limit, without ever having to forklift anything out. That’s a big advantage.
What other innovative and exciting developments do you foresee?
Nesvisky: Wow, a lot of things! For me, on the health care part of the business, the future is really around analytics, whether it's to pave the way for predictive medicine or manage populations of people. I think the Hadoop/E-Series combination is going to be very powerful.
There are a lot of companies in the space taking a lot of interest in how to go about doing analytics for health care in various areas. Some of them are taking very broad approaches, some narrow approaches. One example is being able to do analytics in hospitals around the outbreak of, say, sepsis; they want to track that. Sepsis is very expensive to a hospital… Analytics around predicting whether somebody is likely to get it, whether they’re showing the indications, so we can treat it early before it fully evolves. That’s a big one for us.
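To make the Hadoop/E-Series idea concrete, here is a hypothetical sketch of the kind of tally such a job might run, written as a Hadoop Streaming mapper/reducer pair in Python. The input layout (CSV lines of patient ID, heart rate, temperature) and the screening thresholds are illustrative assumptions, not a clinical algorithm or a NetApp offering.

```python
#!/usr/bin/env python
# Hypothetical Hadoop Streaming mapper/reducer pair sketching the kind of
# sepsis early-warning tally described above. The input layout and the
# crude SIRS-style thresholds are illustrative assumptions only.
import sys

def mapper():
    # Emit "patient_id<TAB>1" for each vital-signs record that trips the screen.
    for line in sys.stdin:
        try:
            patient_id, heart_rate, temp_c = line.split(",")[:3]
            if float(heart_rate) > 90 and float(temp_c) > 38.0:
                print(f"{patient_id}\t1")
        except ValueError:
            continue  # skip malformed records

def reducer():
    # Hadoop sorts mapper output by key, so flagged events for each
    # patient arrive consecutively and can be totaled in one pass.
    current, count = None, 0
    for line in sys.stdin:
        patient_id, n = line.rstrip("\n").split("\t")
        if patient_id != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = patient_id, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

In a Streaming job the same file would serve as both stages (e.g., -mapper 'triage.py map' -reducer 'triage.py', where triage.py is a hypothetical name); the output is a per-patient count of flagged events that downstream tooling could rank for early intervention.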
We’re seeing more private clouds: organizations operating clouds on behalf of the rest of their organization or other organizations that they’re affiliated with. We are also working with some public cloud providers that have some great solutions out there.
Aren’t there fundamental issues with putting patient data in the public cloud?
Nesvisky: Once you explain how the system is architected, it’s really not an issue. Frankly, in a professionally managed, well-architected cloud data center, patient information is much more secure than paper files lying around in a hospital. Once people understand how the data are encrypted at rest and in motion, and how the physical environment is secured, it really becomes a non-issue.
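For readers wondering what ‘encrypted at rest’ looks like in practice, here is a generic sketch using the Python cryptography library’s Fernet recipe. It illustrates the principle only; it is not NetApp’s or any cloud provider’s actual mechanism.

```python
# Minimal illustration of encryption at rest: data is encrypted before it
# touches disk and is only readable again with the key. Uses the Python
# "cryptography" package's Fernet recipe (AES-CBC plus HMAC). This is a
# generic sketch, not how NetApp or any specific cloud provider does it.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in a key-management system
cipher = Fernet(key)

record = b"patient_id=12345, dx=asthma"
with open("record.enc", "wb") as f:
    f.write(cipher.encrypt(record))   # only ciphertext reaches storage

with open("record.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == record  # readable only with the key
```

The same idea applies in motion: TLS encrypts the bytes on the wire, so whether at rest or in transit, an attacker without the keys sees only ciphertext.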
What challenges do you face in finding new customers?
Nesvisky: As you might imagine, health care is a fairly conservative business in terms of its decisions, because these organizations are entrusted with protecting patients’ lives. And so our biggest challenge is just the status quo: ‘Hey, we’ve always bought from these guys. Why would we change?’ We just need to be in front of people.
One of my favorite quotes is from Woody Allen: “Eighty percent of success is showing up.” When we get our opportunity to present, people get very comfortable with us. We win our share of business. I think we have an absolutely superior solution for our industry… This vertical is a very new venture for NetApp. We just have to tell our story, message it effectively, and let people know what we have. Our biggest challenge is really just inertia.