About thinkbridge
thinkbridge is how growth-stage companies finally turn into tech disruptors. They get a new way there – with world-class technology strategy, development, maintenance, and data science all in one place. But solving their technology problems involves a lot more than code. That’s why we encourage think’ers to spend 80% of their time thinking through solutions and 20% coding them. With an average client tenure of 4+ years, you won’t be hopping from project to project here – unless you want to. So you really can get to know your clients and understand their challenges on a deeper level. At thinkbridge, you can expand your knowledge during work hours specifically reserved for learning, or even transition to a completely different role in the organization. It’s all about challenging yourself while you challenge small thinking.
thinkbridge is a place where you can:
- Think bigger – because you have the time, opportunity, and support it takes to dig deeper and tackle larger issues.
- Move faster – because you’ll be working with experienced, helpful teams who can guide you through challenges, quickly resolve issues, and show you new ways to get things done.
- Go further – because you have the opportunity to grow professionally, add new skills, and take on new responsibilities in an organization that takes a long-term view of every relationship.
thinkbridge… there’s a new way there.™
What is expected of you?
As part of the job, you will be required to:
- Develop and maintain Python-based backend services for large-scale data collection and processing, as well as customer-facing web services built with Flask and FastAPI.
- Build efficient, scalable, and maintainable web scraping solutions using tools such as Playwright, Crawl4AI, or other open-source, LLM-backed scraping frameworks (see the brief sketch after this list).
- Work with large datasets provided by our data partners and collected by our systems, and model them efficiently so they can be served to customer-facing applications with sub-second latency.
- Work extensively with third-party APIs to ingest data from other SaaS products into our own datastore.
- Ensure data integrity and quality through validation, deduplication, and error-handling mechanisms.
- Implement and optimize database queries (SQL and NoSQL) for fast data retrieval.
- Collaborate with cross-functional teams to integrate collected data into internal systems.
- Maintain security and compliance best practices in data scraping and storage.
- Write clean, well-documented, and testable code.
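To give a concrete flavour of this kind of work, here is a minimal, illustrative sketch – assuming Playwright for page rendering and FastAPI for serving; the names used (fetch_page, scrape_many, the /pages route, the in-memory CACHE) are hypothetical and not part of thinkbridge’s actual codebase:

```python
"""Illustrative sketch only: an async Playwright scraper feeding a FastAPI endpoint
from an in-memory cache. In production the cache would be a real datastore."""
import asyncio

from fastapi import FastAPI, HTTPException
from playwright.async_api import async_playwright

app = FastAPI()
CACHE: dict[str, str] = {}  # stand-in for Redis/PostgreSQL/etc.


async def fetch_page(url: str) -> str:
    """Render a page with headless Chromium and return its HTML."""
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)
        try:
            page = await browser.new_page()
            await page.goto(url, wait_until="networkidle")
            return await page.content()
        finally:
            await browser.close()


async def scrape_many(urls: list[str], concurrency: int = 5) -> None:
    """Scrape a batch of URLs with bounded concurrency and cache the results."""
    sem = asyncio.Semaphore(concurrency)

    async def worker(url: str) -> None:
        async with sem:
            CACHE[url] = await fetch_page(url)

    await asyncio.gather(*(worker(u) for u in urls))


@app.get("/pages")
async def get_page(url: str) -> dict:
    """Serve previously scraped content; a real service would query a datastore."""
    html = CACHE.get(url)
    if html is None:
        raise HTTPException(status_code=404, detail="page not scraped yet")
    return {"url": url, "length": len(html)}
```

The actual stack, data model, and serving layer will differ per project; the sketch only shows the shape of combining asynchronous scraping with a customer-facing API.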
If these beliefs resonate with you, you’re in the right place!
- Accountability – Finish what you started.
- Communication – Context-aware, proactive, and clear communication.
- Outcome – High throughput.
- Quality – High-quality work and consistency.
- Ownership – Go beyond.
Requirements
Must-have technical skills
- 3+ years of Python development experience with a focus on backend systems.
- Strong expertise in web scraping techniques and frameworks (Scrapy, Beautiful Soup, Selenium, Playwright).
- Experience handling large-scale data processing and working with big datasets.
- Proficiency in SQL and NoSQL databases (PostgreSQL, MongoDB, Redis, etc.).
- Knowledge of asynchronous programming and multiprocessing in Python (see the short sketch after this list).
- Familiarity with cloud services (AWS, GCP, or Azure) for scalable data storage and processing.
- Experience with data pipeline orchestration tools (Airflow, Prefect, or Luigi) is a plus.
- Strong debugging and performance optimization skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- Fluent English communication skills.
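As a rough illustration of the asynchronous programming and multiprocessing point above, here is a minimal sketch that downloads pages concurrently with asyncio and offloads CPU-bound parsing to a process pool. It assumes the httpx client, and parse_record / download are hypothetical helpers, not an existing internal API:

```python
"""Sketch: async I/O for downloads plus a process pool for CPU-bound parsing."""
import asyncio
from concurrent.futures import ProcessPoolExecutor

import httpx  # assumed HTTP client; any async client would do


def parse_record(html: str) -> int:
    """Placeholder for CPU-bound work, e.g. heavy HTML or text processing."""
    return len(html.split())


async def download(client: httpx.AsyncClient, url: str) -> str:
    resp = await client.get(url, timeout=10.0)
    resp.raise_for_status()
    return resp.text


async def main(urls: list[str]) -> list[int]:
    loop = asyncio.get_running_loop()
    async with httpx.AsyncClient() as client:
        pages = await asyncio.gather(*(download(client, u) for u in urls))
    # Offload CPU-bound parsing to worker processes so the event loop stays responsive.
    with ProcessPoolExecutor() as pool:
        results = await asyncio.gather(
            *(loop.run_in_executor(pool, parse_record, page) for page in pages)
        )
    return list(results)


if __name__ == "__main__":
    print(asyncio.run(main(["https://example.com"])))
```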
Good-to-have skills
- Experience with APIs and automation for data extraction.
- Experience with ClickHouse databases is a huge plus.
- Familiarity with containerization technologies (Docker, Kubernetes).
- Knowledge of machine learning or NLP for data analysis is a plus.
- Experience working with LLMs in a scalable and reliable manner.
Our Flagship Policies and Benefits
- Contract (renewable based on performance).
- Remote-first.
- Working hours: USA (PST).
- No loss of pay for pre-approved leaves.