This week’s Cloud Computing jobs – AWS Big Data

Cloud Academy is always on the lookout for the most promising Cloud Computing opportunities.
Employers: interested in reaching our readers with your job openings? Send us an email.
Because “The Cloud” is such a big place, I’m trying to narrow our job searches a little. So this week we will focus on some particularly promising AWS Big Data positions from around the world.
First, though, let’s try to pin down exactly what Big Data is. The following excerpt, from the official AWS whitepaper Big Data Analytics Options on AWS, should give you an idea.

As we become a more digital society the amount of data being created and collected is accelerating significantly. The analysis of this ever-growing data set becomes a challenge using traditional analytical tools. Innovation is required to bridge the gap between the amount of data that is being generated and the amount of data that can be analyzed effectively. Big data tools and technologies offer ways to efficiently analyze data to better understand customer preferences, to gain a competitive advantage in the marketplace, and to use as a lever to grow your business. The AWS ecosystem of analytical solutions is specifically designed to handle this growing amount of data and provide insight into ways your business can collect and analyze it.


1. Lead Quality Engineer (Big Data/Analytics, AWS, Scale)

TuneIn – San Francisco SOMA District

Job Description:
As a key member of the Quality Engineering team, you will work directly with business stakeholders to lead and execute a variety of projects on our streaming audio platform and big data services.
You will hold responsibility for full life-cycle management, including requirements analysis, test case planning and design, and the execution of manual and automated tests and procedures. You will work side by side with the engineering team and coordinate the efforts of onsite and out-of-house testers.
Responsibilities:

  • Identify overall QA milestones, dependencies, issues, and risks for specific projects, and plan and coordinate QA activity accordingly
  • Define test methods and create test documentation for new or existing projects to ensure the necessary test coverage
  • Create test tools that help test platform components and data flow through the system
  • Maintain effective communication with development, PM, and business stakeholders regarding project status and risk assessment
  • Work with and lead the offshore team

2. Software Development Engineer – Big Data, AWS Elastic MapReduce (EMR)

Amazon – US-WA-Seattle

Job Description:
Want to change the world with Big Data and Analytics? Come join us on the Amazon Web Services (AWS) Elastic MapReduce (EMR) team!
EMR is a massively scaled distributed service that gives users fast and predictable Big Data performance with clusters running Hadoop, Hive, Pig, Impala, Spark, Shark and more, with the ability to effortlessly scale up and down as needed. We run millions of customer clusters, enabling processing on vast datasets.
Responsibilities:

  • Keep your finger on the pulse of the constantly evolving and growing Big Data field
  • Translate complex functional and technical requirements into detailed architecture and design
  • Deliver systems and features with top-notch quality, on time
  • Stay current on technical knowledge to keep pace with rapidly changing technology, and work with the team to bring new technologies on board

3. Big Data Engineer – SQL / ETL / AWS / Linux / SAP HANA

Burns Sheehan – London, United Kingdom

Job Description:
If you are a talented, passionate Big Data Engineer, then an exciting new role has been created just for you! You’ll join an impressive Data Analytics team and play a key role in a number of very high-profile projects, working from state-of-the-art offices offering stunning views of London, an on-site gym, and a chill-out area.
Responsibilities:

  • Maintain and scale their data pipeline in the face of explosive growth
  • Own one of many continuous improvement projects to expand the capacity and performance of their platform
  • Support the Data Analytics team’s implementation of new tools to increase the robustness of their end-to-end data chain
  • Develop tools for the administration and monitoring of their core data platform

4. Full Stack JavaScript Developer – Big Data & Analytics Company

Radley Group – Sydney, Australia

Job Description:
Our client, a big data and analytics company that has just won a big contract with one of the big four banks, is on the lookout for a JavaScript developer to work on a number of greenfield, enterprise-grade projects. They have secured their third round of funding, already have more than 30 employees, and are looking to scale up based on this new contract. It is a great opportunity for the right candidate to get on board and drive their existing and greenfield projects forward.
The benefits for the successful candidate will be:

  • Silicon Valley-style employee share schemes
  • A young, dynamic team at the forefront of their industry
  • A tech start-up culture and environment
  • Significant support from SMEs in the business, and an opportunity to hit the ground running and make the role their own

Therefore, we are on the lookout for a bright and talented full-stack JavaScript developer with a primary focus on the back end to come on board and join the team for this large delivery, which will be rolled out globally.
In this role, you will be:

  • a strong developer with at least 3 years’ experience in similar roles
  • a core member of our engineering team
  • concentrating on the improvement and expansion of our API
  • working on our Angular client apps when needed
  • taking ownership of automating and containerising our hosting
  • prototyping new approaches and testing ideas
  • helping to build an awesome engineering culture

5. Data Engineer – Visualization & Reporting

Babbel – Berlin, DE

Job Description:
We are looking for a Data Engineer (full-time) to start immediately in our office in Berlin-Kreuzberg, Germany.
With millions of regular users and more than 7000 hours of premium content, Babbel.com is growing fast while shaping the future of learning.
You are an experienced software engineer who has built and run cloud-based analytics infrastructure in internet companies before and you want to apply your expertise in the challenging setting of a fast moving company.
In your role, you will help evolve our existing data architecture and infrastructure to further enable data-driven insight.
You will collaborate closely with the technical product owner for Analytics, lead technical architects, and the Analytics team to deliver analytics solutions for different stakeholders, as well as introduce new technologies in the areas of analytics, reporting, monitoring, dashboards, and Big Data.
Responsibilities:

  • Manage reporting/dashboard plans for our different internal customers (e.g. product teams, marketing, finance, operations)
  • Interface with engineers, analysts and managers to understand data needs.
  • Design, develop, test, launch new reports and dashboards into production.
  • Provide support for reports and dashboards running in production.
  • Define and manage SLAs for all data sets in allocated areas of ownership.
  • Work with data architecture to triage infrastructure issues and drive them to resolution.
  • Build data expertise and own data quality in allocated areas of ownership.

Written by

I have been a UNIX/Linux system administrator for the past 15 years and am slowly moving those skills into the AWS Cloud arena. I am passionate about AWS and cloud technologies and the exciting future they promise to bring.
