Described as the "Uber of Content", Social Native is a marketplace technology company that empowers marketers to create, source and optimize authentic visual content in the most efficient way possible. Leveraging the world's first AI-powered creative platform, brands such as Unilever, Adidas, L'Oréal, Crocs and Nestlé Waters partner with Social Native to improve the performance of their paid and organic social strategy with a combination of Influencer Marketing, Custom Content, and Content Editing solutions. With our recent acquisition of Olapic, we're changing the way marketers evaluate, refine and optimize their visual content strategy. This move advances our goal of delivering an all-in-one platform that provides brands with data-driven insights, scales content creation, measures the impact of their work, and optimizes content and influencer strategy for even greater results.
Responsibilities:
1. Create and maintain data pipelines to ensure data availability in the data warehouse.
2. Identify, define and implement data transformations to fulfill functional and performance requirements.
3. Optimize the data infrastructure to satisfy business and technical needs.
4. Understand technical details about our products to implement transformational logic in the data pipelines.
5. Ensure data integrity and consistency in the data warehouse.
6. Communicate with other technical teams to obtain and share knowledge regarding operational impact on data.
7. Build Analytics tools based on the data delivered by the pipelines to provide insights to customers and internal stakeholders.
8. Build ad-hoc reports and dashboards using visualization tools.
Qualifications:
1. 5+ years of experience as a Data Engineer or in another data-focused technical role.
2. Advanced SQL knowledge and experience working with relational databases.
3. Strong knowledge of a general-purpose programming language such as Python.
4. Experience building and optimizing data pipelines, infrastructure and architecture.
5. Experience working with scheduling tools like Airflow.
6. Experience with AWS, particularly Redshift.
7. Experience performing root cause analysis and diving into data to identify issues, bugs and improvement opportunities.
8. Experience working with ETL platforms.
Nice to Have:
1. Knowledge of Machine Learning algorithms and experience deploying models.
2. Experience working with Product and Engineering teams in a dynamic environment.
3. Start-up experience at a scaling organization.