Revolutionising the shopping experience through data-driven excellence.
We use leading technologies to minimise risk
We know that it’s important that you can count on us and what we deliver, so we use the latest industry-leading cloud technologies and platforms for everything we do. Naturally, our team is fully trained and certified.
Adaptiv is the leader in data and integration consulting and development in NZ, delivering innovative solutions to meet our clients’ needs through cloud-based services. We employ the best of the best in the industry, which makes it easy to support our staff in gaining the skills necessary to become data and integration experts.
No two days are the same, and while we provide complex solutions for some impressive clients, we know when to switch off and have fun. With no office politics, an excellent company culture and a modern rooftop office (BBQ anyone?), this kind of opportunity does not come around often.
The GCP (Google Cloud Platform) Data Engineer position is a critical role as it provides data consulting across the Company’s partner businesses (clients). The position is responsible for assisting with the design, implementation, review, and updates of the Company’s IT solutions for clients. This is a varied position with exposure to data technologies both on-premises and on GCP.
This position will be required to work on multiple projects at any one time and balance workload accordingly. The role is required to use tools and techniques related to cloud development in a dynamic, constantly changing environment.
This position is highly responsive to the requests of clients with work priorities largely influenced by these stakeholders.
Significant Working Relationships
Executive Management Team
Sales and Marketing Team
Human Resources Team
Data Practice Lead
Suppliers and Vendors
Deliver data solution architecture for clients (data lakes, cloud data warehouses, data enhancement via machine learning APIs)
Be part of a team on larger-scale deployments
Lead the development and implementation of the company’s data engineering solutions on GCP
Undertake hands-on development for client projects
Provide guidance and demonstrate best practices when delivering solutions
Understanding of modern data platforms (cloud or on-premises)
Most of the work will be done with tools of the trade (GCP Dataflow or Cloud Data Fusion). Knowledge of both SQL and Python is required.
Provide consultancy services to our customers
Develop principles and policies for the development of a data lake, a cloud data warehouse (CDW) and machine learning solutions
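The bullets above call for both SQL and Python. As a minimal, hypothetical sketch of how the two combine in a typical transform (using Python's built-in sqlite3 purely as a stand-in for a cloud data warehouse such as BigQuery; table and column names are invented for illustration):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)],
)

# SQL does the aggregation; Python post-processes the result set.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total FROM orders "
    "GROUP BY customer ORDER BY total DESC"
).fetchall()
totals = {customer: total for customer, total in rows}
print(totals)  # {'acme': 200.0, 'globex': 50.0}
```

In a Dataflow or Cloud Data Fusion pipeline the same split applies: declarative SQL (or pipeline stages) for set-based work, Python for the glue and custom logic around it.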
Continuous Integration and Infrastructure as Code
Experienced in use of industry-leading CI/CD platforms such as Git, GCP Cloud Build
Can design and implement full CI/CD pipelines for a data solution
Build test-driven practices utilising best-of-breed frameworks for unit and integration testing
Provision test and UAT environments for clients
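As an illustrative sketch of the test-driven practice described above, a unit test for a small, hypothetical transform function (the function name and fields are invented for this example) using Python's built-in unittest might look like:

```python
import unittest

def normalise_customer(record: dict) -> dict:
    """Hypothetical transform under test: trim and lowercase the customer name."""
    return {**record, "customer": record["customer"].strip().lower()}

class NormaliseCustomerTest(unittest.TestCase):
    """Unit tests a CI pipeline (e.g. Cloud Build) would run on every commit."""

    def test_trims_and_lowercases(self):
        record = {"customer": "  Acme  ", "amount": 10.0}
        self.assertEqual(normalise_customer(record)["customer"], "acme")

    def test_other_fields_untouched(self):
        record = {"customer": "Globex", "amount": 10.0}
        self.assertEqual(normalise_customer(record)["amount"], 10.0)

# Run the suite explicitly (instead of unittest.main()) so the result
# object can be inspected, much as a CI step checks the exit status.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormaliseCustomerTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a full CI/CD pipeline, a build step would run this suite and fail the build if `result` is not successful, keeping transforms regression-tested before deployment.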
Create and design compelling architectural blueprints to deliver value to our customers
Complete independent vendor evaluations
Implement data engineering and machine learning solutions on the GCP technology stack
Implementation assistance on projects
Implement proof of concept prototypes
We will provide access and support for any data engineering training needs on any platform (GCP, Azure, AWS). You will be able to choose your own path.
Getting certified is highly encouraged
Identify and propose development and architectural improvements to existing systems
Identify and introduce industry best practice standards in all projects, work requests and responsibilities
Seek to educate team members and stakeholders about best practice where appropriate
Work with partner businesses to help drive “thought leadership” on data and cloud architecture
Constantly upskill on the latest technologies to remain at the forefront of service delivery
Assist with driving Adaptiv’s thought leadership via social media
Work Planning & Management
Manage daily tasks and projects to ensure work is completed consistent with assigned priorities and deadlines
Assist in evaluating work requests on a technical, architectural and consistency level to provide feedback to stakeholders on viability
Participate in planning meetings as required
Initiate and attend meetings with team and stakeholders to give timely updates on work progress and resolution of issues
Report progress to team and Project Coordinator as appropriate.
Assist the executive team with presales and project establishment
Occupational Health & Safety
Responsible for taking reasonable care to ensure own health and safety at work, and to avoid adversely affecting the safety or health of any other person at work
Ensure adherence to organisation-wide policies and procedures, and monitor the compliance of all team members
Responsibilities on Demand
Due to the fluid and dynamic environment within the group, new, additional, or amended position responsibilities may be required at any time
Successful demonstration of change orientation is an ongoing responsibility of all positions
Key Result Areas
Act as a subject matter expert in data engineering and GCP data technologies.
Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from GCP
Work with Agile, DevOps techniques and implementation approaches in the delivery.
Be required to showcase your GCP data engineering experience when discussing requirements with clients, turning these into technical data solutions.
Be required to build and deliver data solutions using GCP products and offerings.
Willingness to learn and upskill yourself on other cloud platforms (Azure, AWS)
Co-own and build our GCP community, contributing to the platform’s knowledge-exchange learning programme via blog posts, LinkedIn articles, conferences and roadshows.
Represent the Adaptiv brand in the field and help us deliver projects on time and on budget.
At least 5 years of professional experience in a consulting environment.
A university degree is not necessarily required, as we believe more in people than diplomas; experience matters more.
Ideally, you hold at least one valid and active GCP certification
Hands-on, deep experience working with Google data products (e.g. BigQuery, Bigtable, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.).
Experience in Spark (Scala/Python/Java) and Kafka.
Experience working with AI/ML as a service
Knowledge in MDM, Metadata Management, Data Quality, Data Lineage tools and more generally in Data Governance.
End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
Regulatory and Compliance work in Data Management.
End-to-end solution design skills: prototyping, usability testing and data-visualisation literacy.
Experience with SQL and NoSQL modern data stores.
Competitive remuneration package
With us, the focus in HR is more on the Human than the Resources
Casual Friday morning tea and early evening drinks are on us
You choose your weapon: either a MacBook Pro or a Windows workstation
Your mobile subscription is on us
A very central office right in the heart of Ponsonby and Grey Lynn