You will work with the client to design and implement a data pipeline that pulls user data from a public-facing service and stores it in a centralised BigQuery data warehouse. This involves migrating schemas into BigQuery.
Other key responsibilities:
Design, develop, test and support data collection, data integration and ETL applications
Maintain and apply up-to-date, specialist knowledge of database concepts, including unstructured, NoSQL and "big data" platforms
Design ETL frameworks and standards for specific ETL programmes and projects
Required Knowledge, Skills, and Abilities
Minimum 3 years' experience implementing multiple data / ETL projects
Experience working in agile environments
Experience working with cloud environments
BigQuery experience
Experience using an appropriate stack, such as Python, to migrate data
Ruby / Ruby on Rails experience
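As a flavour of the schema-migration work described above, here is a minimal Python sketch that translates a source system's column definitions into the BigQuery JSON schema format. The type mapping and the `users` table are illustrative assumptions, not part of the actual engagement; a real load would hand the result to the google-cloud-bigquery client or `bq load`.

```python
# Illustrative mapping from common relational types to BigQuery types.
TYPE_MAP = {
    "varchar": "STRING",
    "text": "STRING",
    "integer": "INT64",
    "bigint": "INT64",
    "numeric": "NUMERIC",
    "boolean": "BOOL",
    "timestamp": "TIMESTAMP",
}

def to_bigquery_schema(source_columns):
    """Convert [(name, source_type, nullable)] tuples into BigQuery schema dicts.

    The dicts follow the JSON schema format accepted by `bq load` and by
    google.cloud.bigquery.SchemaField.from_api_repr.
    """
    schema = []
    for name, source_type, nullable in source_columns:
        schema.append({
            "name": name,
            "type": TYPE_MAP.get(source_type.lower(), "STRING"),
            "mode": "NULLABLE" if nullable else "REQUIRED",
        })
    return schema

# Hypothetical users table from the public-facing service.
users_schema = to_bigquery_schema([
    ("id", "bigint", False),
    ("email", "varchar", False),
    ("created_at", "timestamp", True),
])
```

Keeping the mapping in plain data like this makes it easy to unit-test the migration independently of any cloud credentials.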