About thinkbridge
thinkbridge is how growth-stage companies can finally turn into tech disruptors. They get a new way there – with world-class technology strategy, development, maintenance, and data science all in one place. But solving technology problems involves a lot more than code. That’s why we encourage think’ers to spend 80% of their time thinking through solutions and 20% coding them. With an average client tenure of 4+ years, you won’t be hopping from project to project here – unless you want to. So you really can get to know your clients and understand their challenges on a deeper level. At thinkbridge, you can expand your knowledge during work hours specifically reserved for learning, or even transition to a completely different role in the organization. It’s all about challenging yourself while you challenge small thinking.
thinkbridge is a place where you can:
- Think bigger – because you have the time, opportunity, and support it takes to dig deeper and tackle larger issues.
- Move faster – because you’ll be working with experienced, helpful teams who can guide you through challenges, quickly resolve issues, and show you new ways to get things done.
- Go further – because you have the opportunity to grow professionally, add new skills, and take on new responsibilities in an organization that takes a long-term view of every relationship.
thinkbridge… there’s a new way there.™
What is expected of you?
As part of the job, you will be required to:
- Design, develop, and maintain scalable data pipelines and analytics solutions
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
- Optimize and fine-tune data models, SQL queries, and transformations for performance and scalability
- Design, develop, and maintain scalable data models and transformations in Snowflake, ensuring data from diverse sources is transformed and loaded effectively into the data warehouse or data lake
- Integrate Fivetran connectors to streamline data ingestion from various sources into Snowflake
- Develop custom Python scripts and functions to automate data workflows and enhance system capabilities (see the illustrative sketch after this list)
- Collaborate with data engineers to automate and orchestrate data pipelines
- Provide technical expertise and support to resolve data-related issues and troubleshoot system failures
- Collaborate with the API development team to integrate data pipelines with external systems and applications
- Implement techniques for query optimization, caching, and workload management
- Perform regular audits and troubleshooting to maintain system performance
- Implement and manage data security measures, including access controls, encryption, and data masking
- Ensure compliance with data governance policies and regulatory requirements
- Communicate project status to all project stakeholders
- Lead and participate in initiatives to enhance data platform capabilities and performance
- Stay updated on emerging trends and technologies in data engineering, cloud computing, and analytics
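To give a flavor of the pipeline, audit, and troubleshooting work described above, here is a minimal, illustrative Python sketch of a data-quality audit against Snowflake. It assumes the snowflake-connector-python package; the table, warehouse, and environment-variable names are hypothetical and are not part of any thinkbridge project.

```python
# Minimal, illustrative sketch only — not thinkbridge code.
# Assumes the snowflake-connector-python package; all table, schema,
# and credential names below are hypothetical.
import os

import snowflake.connector

# Each audit maps a check name to a query that should return 0 offending rows.
AUDIT_QUERIES = {
    # Rows landed by Fivetran today that never reached the transformed model
    "untransformed_rows": """
        SELECT COUNT(*)
        FROM raw.orders r
        LEFT JOIN analytics.orders a ON r.order_id = a.order_id
        WHERE a.order_id IS NULL
          AND r._fivetran_synced >= CURRENT_DATE
    """,
    # Duplicate primary keys in the transformed model
    "duplicate_keys": """
        SELECT COUNT(*) FROM (
            SELECT order_id
            FROM analytics.orders
            GROUP BY order_id
            HAVING COUNT(*) > 1
        )
    """,
}


def run_audits() -> dict:
    """Run each audit query and return {check_name: offending_row_count}."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse name
    )
    try:
        cur = conn.cursor()
        # SnowflakeCursor.execute returns the cursor, so fetchone() can chain
        return {name: cur.execute(sql).fetchone()[0]
                for name, sql in AUDIT_QUERIES.items()}
    finally:
        conn.close()


if __name__ == "__main__":
    for check, count in run_audits().items():
        print(f"{check}: {'OK' if count == 0 else f'FAILED ({count} rows)'}")
```

In practice, checks like these would be scheduled by an orchestrator and raise alerts on failure rather than print to stdout.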
If these beliefs resonate with you, you are looking at the right place!
- Accountability – Finish what you started.
- Communication – Context-aware, proactive, and clean communication.
- Outcome – High throughput.
- Quality – High-quality work and consistency.
- Ownership – Go beyond.
Requirements
Must-have technical skills
- Bachelor’s degree in Computer Science, Engineering, or a related field
- At least 5 years of proven experience as a data engineer, ETL developer, or in a similar role, with a focus on Python and Snowflake
- Strong proficiency in SQL and database concepts, with hands-on experience in Snowflake data warehouse
- Strong proficiency in Python for data manipulation, automation, and scripting; expertise in Azure Durable Functions (see the orchestration sketch after this list)
- Knowledgeable in relational databases, nonrelational databases, data pipelines (ELT/ETL), and file stores
- Knowledgeable in performance tuning and optimization
- Experience with cloud platforms like Azure and tools like Azure OCR for data extraction and processing
- Familiarity with data integration tools like Fivetran
- Knowledge of API development principles and experience integrating data pipelines with external systems
- Familiarity with CI/CD pipelines and version control systems (e.g., Git)
- Proficient in written, verbal, and presentation communication in English
- Ability to work effectively in a fast-paced, collaborative environment and manage multiple priorities
- Excellent analytical, problem-solving, and communication skills
- Snowflake certification (SnowPro Core or SnowPro Advanced)
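Since the role calls out Azure Durable Functions for pipeline orchestration, here is a minimal sketch of the fan-out/fan-in pattern in the azure-durable-functions Python programming model. The activity names (extract_source, load_to_snowflake) and the source list are hypothetical illustrations, not thinkbridge’s actual stack.

```python
# Minimal fan-out/fan-in orchestrator sketch using azure-durable-functions.
# Activity names and the source list are hypothetical.
import azure.durable_functions as df


def orchestrator_function(context: df.DurableOrchestrationContext):
    sources = ["salesforce", "hubspot", "stripe"]  # hypothetical sources

    # Fan out: run one extract activity per source in parallel
    extract_tasks = [context.call_activity("extract_source", s) for s in sources]
    staged_paths = yield context.task_all(extract_tasks)

    # Fan in: load everything that was staged into the warehouse in one step
    rows_loaded = yield context.call_activity("load_to_snowflake", staged_paths)
    return rows_loaded


main = df.Orchestrator.create(orchestrator_function)
```

Durable Functions checkpoints the orchestration at each yield, so a transient failure in one extract replays deterministically instead of restarting the whole pipeline.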
Our Flagship Policies and Benefits
- Remote First
- Working hours: 2:00 PM – 11:30 PM IST
- No loss of pay for pre-approved leaves
- Family Insurance
- Quarterly in-person Collaboration Week (WWW)