
CF Summit 2018


I just returned from CF Summit 2018 in Boston. It was a great event this year, made even more exciting for Pivotal employees by our IPO during the week. I had every intention of writing a technology-focused post, but after having some time to reflect, I decided to take a different route.

After all the sessions were complete and I was reflecting on the large number of end users I had seen present, I decided to go through the schedule and pick out the companies that are leveraging Cloud Foundry in some way and were passionate enough about it to speak at this event. I might have missed a couple when compiling this list, so if you know of one not on here, the omission was not intentional.

  • Allstate
  • Humana
  • T-Mobile
  • ZipCar
  • Comcast
  • United States Air Force
  • Scotiabank
  • National Geospatial-Intelligence Agency
  • Royal Bank of Canada
  • Capital One
  • Kaiser Permanente
  • Boeing
  • Expedia
  • Garmin
  • Travelers
  • General Electric
  • Charles Schwab
  • JP Morgan Chase
  • Swisscom
  • HCSC
  • Southwest Airlines
  • Bloomberg
  • Liberty Mutual
  • American Airlines
  • Allianz Deutschland AG

This is an impressive list of companies that had the opportunity to speak at this 2½-day event. I believe you could put it up against the roster of customers and users speaking at any open-source technology conference and it would rank near the top. That is a testament not only to the technology, but to the incredible work done by the engineers who work on Cloud Foundry every day.

My other takeaway was the consistent focus on developer impact and productivity in the sessions I attended. Aside from a few nuts-and-bolts sessions targeted at those interested in the infrastructure of a Cloud Foundry implementation, almost all the sessions were aimed squarely at how to leverage technology to make software development frictionless. From CI/CD to secrets management, it was all there, and in a developer-friendly format. That is impressive: it is extremely tough to get a large contingent of techies together and have them talk about productivity and business outcomes instead of, say, the guts of the new flux capacitor interface.


I was speaking with an industry analyst who told me the only other conference with a similar focus on customer outcomes is SpringOne Platform. SpringOne Platform has plenty of training and technology-focused sessions, but it also has an abundance of customer impact sessions. Want to see what I mean?

Take a look at https://content.pivotal.io/springone-platform-2017 for the videos from last year's SpringOne Platform.

The 2018 version is looking even better. Come join us in Washington, D.C.

You can register at https://2018.event.springoneplatform.io/register and use the code S1P300_Anniversary to get $300 off this week!
