Want to get to the next step in your international career? We can support you!
Ubiminds is a GPTW-certified, people-first company that partners with American software product companies to scale their development footprint. Ubi custom-curates the top 5% of Brazilian talent for their LATAM strategy, offering a unique combination of staff augmentation and employer-of-record services.
Ubiminds is looking for a Data Engineer to join Paper’s growing R&D and analytics team.
Driven by the mission to democratize education, Paper is the leader in personalized learning. Partnering with innovative schools and school districts, Paper helps deliver true educational equity through their category-leading Educational Support System (ESS) that offers virtual access to 24/7 tutors and essay reviewers.
Join a company partnered with over 700 schools, supporting over 750,000 students in reaching their academic potential, regardless of socioeconomic status, geography, language, or other barriers - what a great mission!
Perks and Benefits
As Data Engineer @Ubiminds, you:
Are placed in a product-based company, with the same treatment as their full-time employees.
Have our full back-office support, from career guidance to HR and concierge services.
Enjoy our remote-first policy – we are a distributed team, after all.
Get your own MacBook (none of that "bring your own device" stuff here).
Have access to growth opportunities with other amazing technology professionals, through tech talks, chapter meetings, and even remote happy hours for tons of fun!
Improve your English through free lessons with a native English speaker - get to the next level on your communication skills!
Miss working in an office? Our cool Florianópolis headquarters is available whenever you want, with weekly quick massages, tasty snacks, soft drinks, and games.
The Data Engineer will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of designing and optimizing their data architecture to support the company's growth.
What you’ll do as a Data Engineer at Ubiminds
Create and maintain a product API for data requests
Design data models, utilizing patterns such as star schemas, data vault, 3NF, temporal, and flat dimensional/fact as appropriate
Develop micro-batch/real-time idempotent data pipelines
Create CI/CD pipelines to automate the deployment of data pipelines to production
Participate in architecture reviews and provide input on cloud data pipeline tools
Ensure code is versioned and data quality checks exist at each stage of the pipeline
Collaborate with CloudOps on the creation of accounts, roles, resource provisioning, and continuous deployment of the data pipeline.
Develop metrics to track/monitor the pipeline and notify/respond in case of anomalies
Migrate and integrate data from AWS RDS, Salesforce, etc. into BigQuery
Back up, restore, and upgrade databases across major versions
To succeed in this position, you will need
Extensive experience in a data engineering role
Proficiency with SQL, Python/pandas, Bash, and columnar databases/cloud data warehouses
Strong analytical skills for working with semi-structured and unstructured datasets.
Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
Working experience owning an end-to-end data pipeline with big data workloads and serverless/distributed processing, using tools/services similar to Pub/Sub, Dataflow, BigQuery, and dbt.
Ability to explain concepts such as compute layer, storage, nodes, concurrency, cache, optimizer, authentication, parallelism, materialized views, execution statistics, and query plan
Nice to have
Experience with Google Cloud services and stream-processing systems
Experience with data orchestration/workflow management tools: Azkaban, Luigi, Airflow, Prefect, etc.