JavaScript

Our use case

It has been close to three years since Drupal 8 was released. With lots of amazing new features landing every six months, we are getting more and more excited to see how Drupal has evolved and improved. As of Drupal 8.4, core has started using ES6. However, there is not yet a standard practice for writing ES6 in custom modules and themes. The ideal solution is for the compiled JS to stay in its original module or theme, and for the output JS to remain compatible with Drupal 8. Drupal is heavily modular, so JavaScript files written for a module should ideally be stored with it. This atomic design pattern lets a module be enabled or uninstalled easily, and keeps our code clean. JS files can also be added at the theme level if they relate more closely to that theme. In this blog we will share our story of configuring a project with webpack and other tools to support ES6 across the whole Drupal 8 project, in both custom modules and themes.
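
To give a taste of what that setup can look like, here is a minimal sketch of a webpack configuration that finds ES6 source files in custom modules and themes and compiles each one back into the directory it came from. The `*.es6.js` naming convention and the docroot layout are assumptions for illustration, not a Drupal standard.

```javascript
// webpack.config.js — a minimal sketch, assuming a *.es6.js naming convention
// and a docroot containing modules/custom and themes/custom; adjust to taste.
const path = require('path');
const glob = require('glob');

// Find every ES6 source file in custom modules and themes, and emit the
// compiled file right next to it, minus the .es6 suffix.
const entries = glob.sync('./{modules,themes}/custom/**/js/*.es6.js').reduce((acc, file) => {
  const name = file.replace(/^\.\//, '').replace(/\.es6\.js$/, '.js');
  acc[name] = path.resolve(__dirname, file);
  return acc;
}, {});

module.exports = {
  entry: entries,
  output: {
    path: path.resolve(__dirname),
    filename: '[name]', // e.g. modules/custom/my_module/js/my_module.js
  },
  module: {
    rules: [
      {
        test: /\.es6\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: { presets: ['@babel/preset-env'] },
        },
      },
    ],
  },
};
```

Because the compiled file lands beside its source inside the module or theme, the existing *.libraries.yml entries keep working and each module or theme stays self-contained when it is enabled or uninstalled.
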
Wondering what DDD stands for? Well, DDD stands for Developers Developers Developers! (presumably taken from Steve Ballmer's famous on-stage chant) It is an inclusive, non-profit conference for the software community. This year, DDD Melbourne was held on 15th September 2018 at the Town Hall in the Melbourne CBD. It was a one-day conference which started at 9:00am and concluded at 5:15pm. Personally, I thought the conference was very well organised; at $79 it was affordable, and being held on a Saturday meant I didn't have to take a day off work either. Depending on what you fancied, there were several talks to choose from. The agenda, which was finalised after attendees voted on the talks, can be found here.
It was early morning rush hour on Friday, June 22nd, and the Melbourne winter sun shone on my face. I was attending Angular Conf Australia, being held at South Wharf. Arriving with just enough time to spare, I looked around the conference venue and had a quick chat with some of the participants. After a slight delay, the conference kicked off with a quick thanks to sponsors and supporters. In this post I'm going to give you a rundown of what I saw next.

NativeScript

Native-what?

If you’re anything like I was before embarking on this project, then you might wonder the same thing. Everyone has heard of React Native, a popular technology enabling a cross-platform mobile development experience, but not so many have crossed paths with NativeScript, or {N}. NativeScript is a framework with a similar goal to React Native: providing the ability to publish native apps for both major platforms, iOS and Android, whilst maintaining only a single (mostly) JavaScript codebase. It is available in three flavours: standard, Angular, and Vue.js, allowing developers with experience in the latter two frameworks to use their existing skills whilst building for mobile. Put simply, it’s a JavaScript runtime inside a native app, translating code on the fly into native elements. In this post, I’m going to tell you about a few of the nice experiences we’ve had recently developing with NativeScript, as well as a few of the not-so-nice issues we encountered, and whether we’d use it again.
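
To make the "native elements" point concrete, here is a tiny sketch of a NativeScript Core page script using the classic tns-core-modules API; the file and handler names are placeholders, not part of any particular app.

```javascript
// main-page.js — a hypothetical page script for NativeScript Core.
// The Label below is rendered as a real native control
// (a UILabel on iOS, a TextView on Android), not an element in a web view.
const { Label } = require("tns-core-modules/ui/label");

exports.pageLoaded = function (args) {
  const page = args.object; // the Page instance that just loaded

  const label = new Label();
  label.text = "Hello from {N}";

  page.content = label;
};
```

The matching main-page.xml would simply point its loaded attribute at pageLoaded, and the runtime takes care of turning the JavaScript objects into platform widgets.
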

Some background

When we started using Google BigQuery - almost five years ago now - it didn't have any partitioning functionality built into it. Heck, queries cost $20 per TB back then too, for goodness' sake! To compensate for this missing functionality and to save costs, we had to manually shard our tables using the well-known _YYYYMMDD suffix pattern, just like everyone else. This works fine, but it's quite cumbersome, has some hard limits, and your SQL can quickly become unruly. Then, about a year ago, the BigQuery team released ingestion-time partitioning. This allowed users to partition tables based on the load/arrival time of the data, or to state explicitly which partition to load the data into (using the $ syntax). By using the _PARTITIONTIME pseudo-column, users could craft their SQL more easily and save costs by addressing only the necessary partition(s). It was a major milestone for the BigQuery engineering team, and we were quick to adopt it into our data pipelines. We rejoiced and gave each other a lot of high-fives.
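
As a quick illustration of what "addressing only the necessary partition(s)" looks like, here is a small sketch using the @google-cloud/bigquery Node.js client to query a single day's partition via _PARTITIONTIME. The project, dataset and table names are made up.

```javascript
// A hypothetical example: count rows in one ingestion-time partition only.
const { BigQuery } = require('@google-cloud/bigquery');

async function countOnePartition() {
  const bigquery = new BigQuery({ projectId: 'my-project' });

  const [rows] = await bigquery.query({
    query: `
      SELECT COUNT(*) AS events
      FROM \`my-project.my_dataset.events\`
      -- Only this partition gets scanned (and billed), not the whole table.
      WHERE _PARTITIONTIME = TIMESTAMP('2018-06-01')`,
    useLegacySql: false,
  });

  console.log(rows);
}

countOnePartition().catch(console.error);
```
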

Intro

Post update: my good friend Lak over at Google has come up with a fifth option! He suggests using Cloud Dataprep to achieve the same result. You can read his blog post about that over here. I had thought about using Dataprep, but because it actually spins up a Dataflow job under the hood, I decided to omit it from my list. It will take a lot longer to run (the cluster needs to spin up, and it issues export and import commands to BigQuery) than issuing a query job directly to the BigQuery API, and there are extra costs involved (the query itself, the Dataflow job, and a Dataprep surcharge - ouch!). But, as Lak pointed out, it would be a good solution if you want to transform your data rather than issue a pure SQL request. However, I'd argue that can be done directly in SQL too ;-)

Not so long ago, I wrote a blog post about how you can use Google Apps Script to schedule BigQuery jobs. You can find that post right here. Go have a read of it now. I promise you'll enjoy it. The post got quite a bit of attention, and I was genuinely surprised that people take the time out to read my drivel. It's clear that BigQuery's popularity is growing fast. I'm seeing more content popping up in my feeds than ever before (mostly from me, because that's all I really blog about).

However, as awesome as BigQuery is, one glaring gap in its arsenal of weapons is the lack of a built-in job scheduler, or an easy way to do it outside of BigQuery. That said, I'm pretty sure the boffins over in Googley-woogley-world are currently working on remedying that - by either adding schedulers to Cloud Functions, or by baking something directly into the BigQuery API itself. Or maybe both? Who knows!

Server-side rendering a React app is a miracle on par with childbirth and modern air travel.

OK, that opening sentence was a little over-the-top. I apologise to birth mothers and those in the aviation industry.

Let me start again: server-side rendering a React app is...kind of cool.

That said, it can be a little tricky to get started, especially if you're trying to do it with an existing app.

In this post I’ll explain one way you can implement server-side rendering (SSR) for an app that's using React Router v4 and Redux Thunk.

Along the way we'll discuss the fundamental difference between JavaScript clients and servers, how it forces us to change the way we do routing, and the small "missing link" that enables us to bridge React Router v4 with Redux thunks.

We'll build up a simple example to demonstrate. I'm going to assume you've got some knowledge of:

  • React
  • Redux
  • React Router v4

However, you are not required to have knowledge of:

  • Childbirth
  • Aeronautics

Let's do this.
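
Before we dive in, here is a rough sketch of where we're headed: an Express request handler that renders the app with StaticRouter and a per-request Redux store wired up with thunk middleware. The App component, rootReducer, and bundle path are placeholders for whatever your own app uses.

```javascript
// server.js — a minimal sketch, not a drop-in implementation.
import express from 'express';
import React from 'react';
import { renderToString } from 'react-dom/server';
import { StaticRouter } from 'react-router-dom';
import { Provider } from 'react-redux';
import { createStore, applyMiddleware } from 'redux';
import thunk from 'redux-thunk';

import App from './App';
import rootReducer from './reducers';

const app = express();
app.use(express.static('dist')); // hypothetical folder holding the client bundle

app.get('*', (req, res) => {
  // A fresh store per request, so state never leaks between users.
  const store = createStore(rootReducer, applyMiddleware(thunk));

  // There is no browser history on the server, so StaticRouter takes the
  // location straight from the incoming request instead of BrowserRouter.
  const context = {};
  const html = renderToString(
    <Provider store={store}>
      <StaticRouter location={req.url} context={context}>
        <App />
      </StaticRouter>
    </Provider>
  );

  // Serialise the store so the client can pick up exactly where we left off.
  const preloadedState = JSON.stringify(store.getState()).replace(/</g, '\\u003c');

  res.send(`<!doctype html>
    <html>
      <body>
        <div id="root">${html}</div>
        <script>window.__PRELOADED_STATE__ = ${preloadedState}</script>
        <script src="/bundle.js"></script>
      </body>
    </html>`);
});

app.listen(3000);
```

What this sketch glosses over, letting the route's thunks finish fetching their data before renderToString is called, is roughly the "missing link" territory the rest of the post digs into.
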

Do you recoil in horror at the thought of running yet another mundane SQL script just so a table is automatically rebuilt for you each day in BigQuery? Can you barely remember your name first thing in the morning, let alone remember to click "Run Query" so that your boss gets the latest data refreshed in his fancy Data Studio charts, and then takes all the credit for your hard work? Well, fear not, my fellow BigQuery'ians. There's a solution to this madness. It's simple. It's quick. Yes, it's Google Apps Script to the rescue. Disclaimer: all credit for this goes to the one and only Felipe Hoffa. He 'da man!
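
The gist of the approach is a short Apps Script function that submits your query as a BigQuery job with a destination table, plus a time-driven trigger that runs it every morning. The sketch below is a hypothetical example, assuming the BigQuery advanced service is enabled for the script (Resources > Advanced Google services); the project, dataset and table names are placeholders.

```javascript
// Rebuilds my_dataset.daily_snapshot from a query — a made-up example.
function rebuildDailyTable() {
  var projectId = 'my-project';

  var job = {
    configuration: {
      query: {
        query: 'SELECT * FROM `my-project.my_dataset.source` WHERE event_date = CURRENT_DATE()',
        useLegacySql: false,
        destinationTable: {
          projectId: projectId,
          datasetId: 'my_dataset',
          tableId: 'daily_snapshot'
        },
        writeDisposition: 'WRITE_TRUNCATE' // overwrite the table on every run
      }
    }
  };

  // Submit the query job; BigQuery does the rest.
  BigQuery.Jobs.insert(job, projectId);
}
```

Attach a time-driven trigger to rebuildDailyTable (Edit > Current project's triggers) and the table rebuilds itself each day without anyone having to click "Run Query".
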