WE SPECIALISE IN FINDING FANTASTIC OPPORTUNITIES
FOR DIGITAL AND DATA SPECIALISTS WITH THE MOST INNOVATIVE BUSINESSES ACROSS EUROPE AND THE USA.
New York, USA
15 days ago
Roles and Responsibilities
- Data Flows: Ensure that data flows accurately from all sources to Snowflake and meets established SLAs. Transform our raw internal data into custom attributes for CRM systems.
- System Administration: Monitor the performance of Snowflake and introduce optimizations. Manage regular Airflow deployments and updates.
- Project Management: Gather business requirements and create technical implementations and end-user documentation.
- Reporting Automation: Create Looker Looks and Dashboards for internal and external audiences.
- Company Data Knowledge Center: Become a go-to expert on internal company data and train people on how to interpret the documentation and use Looker Explores.
- Data Modeling: Actively participate in data modeling and data structure design with the Product and Engineering teams.
Requirements
- 4-6 years' experience in a similar role at a technology company - ideally with an e-commerce focus.
- Experience with Thimble's current tech and data stack: Snowflake, Apache Airflow, dbt, Dask, Looker, Appsflyer (Data Locker)
- Excellent written and verbal communication skills.
- Strong SQL skills are a core technical requirement.
- Strong Python skills are likewise required.
- Prior experience with Dask is a plus.
- Prior experience with event stream data is a plus.
- Knowledge of statistics and machine learning is a plus.
What you'll do in 30 days:
- Understand Thimble's product offering and differentiation
- Understand Thimble's organizational structure, the function of each individual team, and how the teams operate together as it relates to your position
- Understand Thimble's end-to-end purchase experience and product offerings
- Gain access to and get up to speed with our current tech stack and its setup
- Gain familiarity with Snowflake, Looker, etc.
What you'll do in 60 days:
- Gain familiarity with Appsflyer and Data Locker
- Engage with business teams to understand attribution project goals
- Develop an implementation plan for the attribution project
- Start executing the attribution project implementation plan
What you'll do in 90 days:
- Gain familiarity with Iterable and other CRM data
- Engage with business teams to understand CRM project goals
- Develop an initial implementation plan for the CRM data project
- Start executing CRM project implementation plan
Benefits
- Comprehensive health, vision, and dental coverage.
- 23 days paid vacation plus company-wide holidays.
- Ability to work remotely, apart from 1-2 days of onsite meetings per month in New York City.
- Company computer hardware of your choice and a generous contribution to your remote office.
- Virtual trivia and field trips.
- Competitive pay.
- Generous stock options.
Darwin Recruitment is acting as an Employment Agency in relation to this vacancy.
We are looking for a Data Engineer to join one of our clients in Amsterdam.
Requirements:
- Experience with Big Data architectural best practices
- Cloud experience (GCP, AWS)
- Experience with solution building and architecting using public cloud offerings such as GCP (e.g. BigQuery/DataPrep/DataStudio) or AWS (e.g. S3/Redshift/Athena)
- Strong understanding of building RESTful APIs
Do you see yourself as a fit? Are you immediately available to join? Apply immediately and we will get in touch!
Darwin Recruitment is acting as an Employment Business in relation to this vacancy.
If you are a Data Engineer and you're looking for a new opportunity, I have a prestigious and exciting client with a great opportunity. The client is based in Germany, but this role will be fully remote.
Requirements:
- Excellent SQL experience
- Excellent Python programming skills
- Experience with streaming
- Experience working with cloud-native technology such as GCP or AWS would be beneficial
- Experience with workflow orchestration tools such as Apache Airflow
- An interest in acquiring in-depth knowledge of the commercial feed tool ProductsUp
Please send your updated CV along with the best number to reach you on.
Darwin Recruitment is acting as an Employment Business in relation to this vacancy.
For a global leader in the HR services industry, we are looking for a Cloud FinOps Data Engineer. You will be responsible for cloud optimization and financial practices toward a sustainable cost optimization lifecycle. The efficient management of data is crucial, and we need your expertise to scale out and up while maintaining the quality of the delivered services. If this sounds interesting, please apply now!
Tasks:
- You will interact with cloud optimization engineers, technical account managers, the operations team, subject matter experts, cloud architects, and business/finance controllers.
- You love thinking along with the team about how to migrate, optimize, build, and architect the current data model and methods into a new data management model using public cloud offerings such as GCP (BigQuery/DataPrep/DataStudio) or AWS (S3/Redshift/Athena), while keeping an automation mindset.
- You will have the opportunity to apply your background in building analytics data models to quality standards (availability, integrity, confidentiality, actuality, frequency, and completeness) to support the team across a broad range of analytical requirements.
- You will own the data model end to end, so your skills will come in handy not just to collect, extract, and clean the large-scale cloud billing and usage data from multiple clouds and other sources, but also to understand the systems that generated it.
- You will be responsible for improving and scaling the data by adding new sources, coding business rules, and producing new metrics that support the team in creating relevant reports and scorecards, which will ultimately be used to drive discussions with C-level customers and drive decisions.
Requirements:
- 5+ years' experience in a similar role.
- Experience with solution building and architecting with public cloud offerings such as GCP (BigQuery/DataPrep/DataStudio) or AWS (S3/Redshift/Athena).
- Experience with test automation and ensuring data quality across multiple datasets used for analytical purposes.
- Experience with Big Data architectural best practices.
- Solid understanding of building RESTful APIs.
Offer:
- Duration is 6-12 months, with extension possible.
- 36-40 hours per week.
- You must live in the Netherlands!
Apply now! If you are interested in this position of Cloud FinOps Data Engineer, please apply now or contact me via firstname.lastname@example.org or 0203050061.
Darwin Recruitment is acting as an Employment Business in relation to this vacancy.
Datawarehouse Developer/Data Engineer
Would you like the opportunity to not only work with the latest tech but also have the chance to do some good in the world? Darwin Recruitment is currently partnered with an industry-leading global product company. The business is a subscription-based, not-for-profit organisation that last year donated over 1 billion SEK to charity. They have 5 products within their business that generate the money which is subsequently donated to charity. As a business they have 1 million customers in this country, and they operate in many other countries under the same brand. We are currently looking for either a Datawarehouse Developer or a Data Engineer to join their technology team. They have recently fully migrated to AWS and are looking to add one more person to the team.
Your role will include:
- Requirements analysis
- Design and development of new functionality
- Developing and maintaining the data warehouse architecture
- Implementation of new data models and ETL processes
Your skills and experience will include:
- Advanced knowledge of SQL
- Programming experience with either Java or Python
- Experience with cloud-based DW solutions
- Ideally, experience with Machine Learning
- An understanding of Swedish
If you would like to find out more information, apply or email email@example.com.
Darwin Recruitment is acting as an Employment Agency in relation to this vacancy.