About thinkbridge
thinkbridge is how growth-stage companies can finally turn into tech disruptors. They get a new way there – with world-class technology strategy, development, maintenance, and data science all in one place. But solving technology problems involves a lot more than code. That’s why we encourage think’ers to spend 80% of their time thinking through solutions and 20% coding them. With an average client tenure of 4+ years, you won’t be hopping from project to project here – unless you want to. So you really can get to know your clients and understand their challenges on a deeper level. At thinkbridge, you can expand your knowledge during work hours specifically reserved for learning, or even transition to a completely different role in the organization. It’s all about challenging yourself while you challenge small thinking.
thinkbridge is a place where you can:
- Think bigger – because you have the time, opportunity, and support it takes to dig deeper and tackle larger issues.
- Move faster – because you’ll be working with experienced, helpful teams who can guide you through challenges, quickly resolve issues, and show you new ways to get things done.
- Go further – because you have the opportunity to grow professionally, add new skills, and take on new responsibilities in an organization that takes a long-term view of every relationship.
thinkbridge… there’s a new way there.™
What is expected of you?
As part of the job, you will be required to:
Data Pipeline Design and Development:
- Build and manage scalable, reliable ETL/ELT pipelines to process structured and unstructured data.
- Use Azure Data Factory (ADF) to orchestrate data workflows and implement integration solutions.
- Leverage AWS Redshift and Snowflake to store and process large-scale datasets efficiently.
Data Modeling and Warehousing:
- Design and implement efficient data models and schemas for Snowflake and Redshift to support analytics and reporting needs.
- Optimize data warehouse performance, ensuring cost efficiency and scalability.
Data Integration and Migration:
- Perform data migration activities between on-premises systems and cloud platforms.
- Integrate data from diverse sources, including APIs, cloud storage, and legacy systems.
Performance Optimization:
- Monitor and fine-tune Snowflake, Redshift, and ADF workflows to enhance query performance and pipeline reliability.
- Troubleshoot issues related to data processing and platform performance.
Collaboration:
- Partner with Data Scientists, Analysts, and stakeholders to understand data requirements and provide actionable solutions.
- Collaborate with DevOps and cloud teams to manage infrastructure and automate deployments.
Documentation and Best Practices:
- Maintain thorough documentation of data pipelines, workflows, and system architecture.
- Follow data governance, security, and compliance standards across all projects.
If these beliefs resonate with you, you’re looking at the right place!
- Accountability – Finish what you started.
- Communication – Context-aware, proactive, and clear communication.
- Outcome – High throughput.
- Quality – High-quality work and consistency.
- Ownership – Go beyond.
Requirements
Must have technical skills
- 3–5 years of experience building scalable workflows in Azure Data Factory (ADF).
- Experience designing efficient data models and optimizing performance in Snowflake and AWS Redshift.
- Hands-on experience migrating and integrating data from APIs, cloud storage, and legacy systems.
- Ability to optimize pipelines and resolve data-processing issues.
- Strong communication skills and experience collaborating with cross-functional teams (Data Scientists, Analysts, DevOps).
Our Flagship Policies and Benefits
- Remote First
- Working hours: 2:30 PM – 11:30 PM IST (3 AM – 12 PM CST)
- No loss of pay for pre-approved leaves
- Family Insurance
- Quarterly in-person Collaboration Week (WWW)