Datrium recently published some crazy benchmarks for its open converged platform. The benchmarks created an interesting news cycle, along with questions. The first question: what is Open Converged? We know about converged and hyper-converged systems. And what about these benchmarks? What do they mean? I'm looking to get answers from Datrium's executives this week as part of the vBrownBag Build Day.
There are no longer any major backup software or storage hardware companies. Every one of these companies has repositioned its messaging around serving or protecting data assets. What does this mean for the enterprise? It's a reaction to the change of focus. Big Data is a buzzword.
So, you missed the webinar on serverless? Or you attended and want to review the content? I created a 23-minute version of the 45-minute webinar. A reminder: serverless has servers. The primary difference is that the context switches to developers. In serverless platforms such as Functions as a Service (FaaS) or Platform as a Service (PaaS), it's about the developer's perspective. If the developer only needs to concern themselves with the code, then by definition it's serverless. If developers must worry about containers and container orchestration, then it's no longer a serverless platform.
What about operations? Well, watch the webinar and learn more about the operator's view.
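The "developer only concerns themselves with code" idea is easiest to see in a FaaS handler. The sketch below follows the common AWS Lambda-style handler convention; the event shape and field names are illustrative assumptions, not any specific platform's API. Everything outside this function (provisioning, scaling, patching the OS) is the platform's problem.

```python
import json

def handler(event, context):
    """A minimal FaaS-style function: this is all the developer writes.
    Servers still exist underneath, but they are the platform's concern."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Locally, a platform invocation can be simulated with a plain call:
print(handler({"name": "serverless"}, None))
```

The operator's view is the inverse: someone still has to run, monitor, and pay for the machines that invoke this function.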
First off, there are servers in serverless computing. However, the construct of servers is a throwback to a time when the only way to develop distributed applications was to treat the server as the base component of the infrastructure. In the server-centric model of distributed systems, developers needed to understand their CPU, memory, and OS requirements. An argument can be made that coders should only need to concern themselves with code. That's the point of serverless.
I’ve worked in enterprise IT for far too long to fall into the “privacy at work is king” camp. If you use your organization's resources, then you may give up the right to privacy at work. I’ve implemented data loss protection (DLP) software for a 12,000-user U.S. Federal agency. Users don’t like it. It impacts system performance, and automatic encryption of removable devices makes data sharing more difficult. However, the worst scenario is having an Equifax-like headline hit the news. Such breaches are why encryption-breaking technology for snooping on employees remains interesting.
Coordination is a business unto itself. The larger the team, the larger the effort to coordinate work within and between organizational units. When an organization talks about flattening the org chart, it's a result of out-of-control overhead. Computer systems are not immune from this impact.
Google's Kelsey Hightower joins the podcast for a reality check for enterprise IT. Keith and Mark ask the question, "Can enterprises adapt?" to the quickly changing technology landscape. Do organizations have the budget and aptitude for navigating the disruption? Kelsey gives a pretty blunt assessment. He claims debates around issues such as CI/CD are now over, and he shares his vision of the relationship between containers, the cloud, and business value.
The Equifax hack raised a basic question. Don’t most Fortune 500 companies have some type of data loss protection (DLP) system in place to prevent such breaches? Or at least to identify a breach before too much damage is done? You’d think that would be the case after high-profile consumer data hacks such as Target. The technologies to identify these intrusions are ample; the operations discipline to utilize them remains immature. Next up in our Tech Field Day 15 previews is network virtualization and security company Ixia.
The Tech Field Day (TFD) presenter up next in my series of TFD 15 previews is DataCore. DataCore offers a virtual SAN, or hyperconverged, solution. You may ask: why do we need yet another software-defined storage solution? VMware has come out with vSAN; aren’t we done? The simple fact is that there will be opportunities for virtual SAN companies as long as there’s performance to be eked out of the underlying storage infrastructure, or cost savings to be had.
As a reminder, Tech Field Day is an independent influencer event sponsored by 7 to 8 companies. Stephen Foskett flies in 12 delegates to listen to the presenters discuss the business and technical details of their products and services. It’s a great opportunity to deep-dive into the technology. Outside of travel, meals, and vendor swag, delegates are not compensated for the event, nor are they required to write about the event or the companies. That said, it often makes sense to do so when appropriate.
Since this is DataCore’s first TFD event, it helps to share some background about the company. DataCore is far from a startup. The company was founded in 1998, before the first Internet bubble burst. It’s about as old as my IT career. The company has raised about $100M in funding, with the last round completed in 2008. With the most recent funding round almost ten years in the past, I assume the underlying business remains stable, though the company may be challenged to break into the next layer of success. DataCore has watched many competitors start up and surpass it in the market. With that said, I’m interested to see how it differentiates from market leaders vSAN, Nutanix, and HPE SimpliVity.
DataCore’s primary strength seems to be performance gains from parallel I/O. There are some potential advantages to increased I/O via virtual SAN platforms. The workloads that come to mind are analytics and business intelligence (BI) platforms. Companies have looked to in-memory databases to speed up the performance of BI and analytics workloads. The challenge? Moving an existing relational database to an in-memory platform is expensive and involves risk.
The poster child for such a transformation is SAP HANA. Anytime a customer considers a HANA migration, a salesperson at a consulting agency gets his wings. It’s an expensive transformation, from the infrastructure to the application. There’s an opportunity if solutions such as DataCore’s can deliver significant I/O gains for legacy BI without the massive investment in application upgrades.
Analytics and BI are just one example. I’m sure there are high-performance computing (HPC) and general-purpose use cases as well. Combining a platform that promises increased I/O bandwidth with faster storage underlays powered by new memory technologies such as Intel Optane should at least make for an interesting TFD presentation.
I’m excited to hear from DataCore. It’s yet another HCI vendor, but one that seems to have had the secret sauce for a long-term business to date. I’m interested to hear how it lasts another 20 years given the increased competition in virtual SAN and software-defined storage.
We’ve heard the term zero-trust before. VMware has used it to describe the micro-segmentation done with NSX. Network micro-segmentation assumes no trust between systems based on their location within the physical or logical network. Skyport Systems takes the concept to the extreme. In Skyport’s architecture, the network, application, compute, and user share no level of trust other than zero trust.
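The core of the micro-segmentation idea can be sketched as a default-deny policy keyed on workload identity rather than network location. The workload names and policy shape below are purely illustrative assumptions, not NSX's or Skyport's actual policy model.

```python
from dataclasses import dataclass

# Explicit allow list: (source workload, destination workload, destination port).
# Anything not listed is denied, regardless of subnet or VLAN.
ALLOWED = {("app-frontend", "app-db", 5432)}

@dataclass
class Flow:
    src_workload: str
    dst_workload: str
    dst_port: int

def permit(flow: Flow) -> bool:
    """Default deny: only explicitly allowed flows pass, even between
    two hosts sitting on the same physical network segment."""
    return (flow.src_workload, flow.dst_workload, flow.dst_port) in ALLOWED

print(permit(Flow("app-frontend", "app-db", 5432)))  # allowed: True
print(permit(Flow("app-frontend", "app-db", 22)))    # not listed: False
```

The point of the sketch is the default: in a perimeter model the second flow would succeed because both workloads are "inside"; under zero trust, adjacency buys nothing.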
Secondary data has seen a boom in investment. Cohesity, Rubrik, and Druva have each seen recent investments of $80M or more. Either there’s incredible product demand, or investors are drunk on the market segment. TFD15 is an opportunity for Actifio to share its updated strategy after a few years of lessons learned.
Scale Computing is now a seasoned veteran at presenting to the TFD audience. However, this is a new market for the company. Presenters will need to talk beyond speeds and feeds. Unlike competitors such as Nutanix, Scale Computing doesn’t offer support for the VMware ecosystem.
I've talked about VMware Cloud (VMC) on AWS for a few weeks, since the GA release during VMworld 2017. The solution is unique in that Amazon doesn't modify its data center design for any partner. VMC on AWS required the two companies to work together to make two very different data center designs work. The solution raises questions about lift-and-shift as a cloud migration strategy.
The hotel industry represents the vertical integration of technology and the consumer experience. However, Airbnb is in the process of disrupting the entire industry. Digital transformation is only part of the equation.
We had Intel on the CTO Advisor Podcast discussing Optane storage and what it means to the data center and applications. During Tech Field Day Extra at VMworld 2017, we were reminded that a physical memory technology is only as good as the protocols carrying the data. Kingston presented its vision for NVMe-based SSDs alongside startup Liqid.
The NSX team was describing how application traffic flows over the virtual network. The team described the interchanges within the hypervisor. The interchange system is called a switch. While not new, the concept threw me for a loop.
Coming off of Dell EMC World, HPE Discover, and Interop, I'm convinced that legacy IT infrastructure companies need a reset. Today, AWS and Google are having conversations directly with the consumers of technology. It's ironic that the theme of digital transformation is to remove friction from business relationships. Online shopping disrupts brick-and-mortar retail because it removes friction.
Are all software-defined WAN (SD-WAN) solutions alike? With over 40 vendors in the space, I’d say likely not. Talking with Packet Pushers’ Greg Ferro on Twitter, he believes that many of these vendors will survive. While the space is fragmented, Ferro believes there’s room enough, and market opportunity, for most of the existing players to make a going concern of the SD-WAN business. It’s the question of what makes one SD-WAN vendor different from another that I approach in this follow-up to the TELoIP presentation at Network Field Day 15 (NFD15).