Location: Austin, Texas, United States
Organization: InCommodities North America
Application deadline: We’ll review applications on an ongoing basis, so don’t wait too long to send yours
Turn complex market feeds into data that drives real trading impact.
We’re expanding our Technology team in Austin and looking for a Market Data Integration Engineer to join our crew. You’ll be the one ensuring our traders have access to fast, reliable, and high-quality market intelligence – because every millisecond and data point counts.
If you love building systems that handle real-world complexity, optimizing pipelines, and seeing your work impact trading decisions in real time, this one’s for you.
What you’ll do
- Design and build robust data integration processes that capture market data from APIs, real-time feeds, FTPs, and web interfaces
- Develop smart extraction systems to handle diverse formats (JSON, XML, CSV, HTML) – and make sure bad data doesn’t slip through
- Implement monitoring and alerting that keep data quality, timeliness, and completeness on point
- Optimize collection pipelines for performance and reliability – we’re talking low latency, high accuracy
- Work closely with traders and analysts to explore new data sources and prototype integrations fast
- Maintain and enhance existing pipelines to ensure everything runs smoothly during live markets
- Collaborate with the platform team to connect your work to our Kafka/Flink streaming infrastructure
- Document sources, patterns, and troubleshooting steps for the team’s knowledge base
- Join code reviews, share ideas, and help shape our best practices for data integration
Your team
You’ll join a skilled, fast-moving team that thrives at the intersection of trading and technology. We collaborate closely with our trading desks, data analysts, and engineers to keep InCommodities ahead of the market. Expect autonomy, direct impact, and teammates who value execution, curiosity, and craft.
What you bring
- You have 1–3+ years of software development experience with solid Python skills
- You have a bachelor’s degree in computer science or a related field
- You have experience with API integrations and web scraping tools (BeautifulSoup, Scrapy, Selenium)
- You have a strong understanding of data formats (JSON, XML, CSV) and HTML DOM manipulation
- You have familiarity with event-driven architectures or streaming (Kafka experience is a plus)
- You have a problem-solving mindset – you can handle messy, unreliable data sources and make them work
- You have basic Java knowledge for debugging framework-level issues
- You have experience with error handling and data quality validation
- You are comfortable with Git and DevOps basics
How you work
- You are a clear communicator and can explain data issues to traders and analysts
- You are execution-driven – you move fast but don’t compromise reliability
- You are curious and adaptable – new data sources don’t scare you
- You are collaborative – you share knowledge and help keep the codebase strong
- You are calm under pressure – you know trading environments don’t wait
What you can expect from us
Expect a steep learning curve, supported by a thorough onboarding program that sets you up for success from day one. From there, you’ll enjoy a high level of autonomy, working alongside exceptional colleagues who are bold, honest, ambitious, and among the smartest in their fields. Our culture is unique: we live our values of Honesty, Transparency, and Rethink, and we do it without rigid hierarchies or dress codes. Along the way, you’ll take part in great events, both within your team and across the organization, and you’ll have the chance to grow both professionally and personally.
Benefits
- Flexible working hours and hybrid work schedule
- Health insurance
- 401k with employer matching
- Parental leave, including paid and unpaid options
- Generous PTO policy and paid holidays
- Monthly DoorDash allowance, plus snacks and drinks at the office
- Internet and optional cellphone reimbursement
- Access to office gym