google cloud dataflow Tag

[Image: All the GDEs posing at the Googleplex]

A few months back, Shine's Pablo Caif and Graham Polley were welcomed into the Google Developer Expert (GDE) program as a result of their recent work at Telstra. The projects they are working on involve building bleeding-edge big data solutions using tools like BigQuery and Cloud Dataflow on the Google Cloud Platform (GCP). You can read all about that here. GDE acceptance comes with many benefits and privileges, one of which is a yearly trip to a private summit at a different location each year. With Google footing the bill, all the GDEs (around 250 currently) are brought in from around the globe for, let's admit it, a complete Google geek-out fest for two days! This year the summit was at the Googleplex in Mountain View. Needless to say, Pablo and Graham were chomping at the bit to go. However, in addition to the summit, Google invited them to fly out prior to the actual summit itself. They had lined up a few other things especially for the guys. So this was no ordinary trip. Lucky buggers! We asked both guys to give their individual feedback on the trip, and here's what they had to say about it. Read on if you want to hear about how the guys spent six days hanging out with Google in America.
One of the projects that I'm currently working on is developing a solution whereby millions of rows per hour are streamed in real time into Google BigQuery. This data is then available for immediate analysis by the business. The business likes this. It's an extremely interesting, yet challenging, project, and we are always looking for ways of improving our streaming infrastructure. As I explained in a previous blog post, the data/rows that we stream to BigQuery are ad impressions, which are generated by an ad server (Google DFP). Getting that pipeline running was a great accomplishment in its own right, especially after optimising our architecture and adding Redis into the mix. Using Redis added robustness and stability to our infrastructure. But – there is always a but – we still need to denormalise the data before analysing it. In this blog post I'll talk about how you can use Google Cloud Pub/Sub to denormalise your data in real time before performing analysis on it.
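To make the denormalise-then-stream idea concrete, here is a minimal sketch (not the production pipeline described above) that pulls ad-impression events from a Cloud Pub/Sub subscription, joins each one against some reference data, and streams the enriched row into BigQuery. The project, subscription and table names, the message fields, and the in-memory lookup table are all hypothetical placeholders; in a real setup the lookup would come from something like Redis or another fast store.

```python
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-gcp-project"                              # hypothetical
SUBSCRIPTION_ID = "ad-impressions-sub"                     # hypothetical
TABLE_ID = "my-gcp-project.ads.impressions_denormalised"   # hypothetical

bq_client = bigquery.Client(project=PROJECT_ID)

# Stand-in for the reference data (e.g. campaign metadata) that a real
# pipeline would fetch from Redis or a similar low-latency store.
CAMPAIGN_LOOKUP = {
    "12345": {"campaign_name": "Spring Sale", "advertiser": "Acme"},
}


def denormalise(impression: dict) -> dict:
    """Join a raw impression with its campaign reference data."""
    campaign = CAMPAIGN_LOOKUP.get(impression.get("campaign_id"), {})
    return {**impression, **campaign}


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    impression = json.loads(message.data.decode("utf-8"))
    row = denormalise(impression)

    # Streaming insert: the row becomes queryable in BigQuery within seconds.
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        print(f"BigQuery insert failed: {errors}")
        message.nack()  # let Pub/Sub redeliver the message
    else:
        message.ack()


if __name__ == "__main__":
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
    future = subscriber.subscribe(subscription_path, callback=callback)
    print(f"Listening for ad impressions on {subscription_path}...")
    future.result()  # block and keep pulling messages until interrupted
```

The point of the sketch is simply that the join happens on the message path, so the data landing in BigQuery is already flat and ready for analysis; the actual post goes into how Pub/Sub fits into the wider streaming architecture.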

My work commute

My commute to and from work on the train is on average 17 minutes. It's the usual uneventful affair, where the majority of people pass the time by surfing their mobile devices, catching a few Zs, or reading a book. I'm one of those people who like to check in with family & friends on my phone, and see what they have been up to back home in Europe while I've been snug as a bug in my bed. Stay with me here folks. But aside from getting up to speed with the latest events from back home, I also like to catch up on the latest tech news, and in particular what's been happening in the rapidly evolving cloud space. And this week, one news item in my AppyGeek feed immediately jumped off the screen at me. Google have launched yet another game-changing product into their Cloud Platform big data suite. It's called Cloud Dataproc.