Overview
Responsibilities:
- Support the design and development of complex data pipelines
- Support extract, transform, load (ETL) efforts to ensure delivery of high-quality data
- Support prototyping and development activities for data-intensive applications
Qualifications & Skills:
- TS/SCI with Poly
- Bachelor’s degree in Computer Science, Computer Engineering, or a related discipline
- Experience with large-scale data processing software like Apache Spark or Hadoop
- Experience working within a cloud environment (e.g., AWS, Azure)
- Experience with SQL databases (e.g., Postgres); NoSQL experience is a plus
- Experience developing and consuming RESTful APIs and services
- Programming/development skills and experience in at least one of the following: Python, Java, C++, Scala, R
- Understanding of distributed systems
Bridge Core is proud to be an equal opportunity workplace and affirmative action employer. We celebrate diversity and are committed to creating an inclusive environment for all team members and applicants. At Bridge Core, we ensure fair treatment for our team members and applicants based on their abilities, achievements, and experience without regard to race, national origin, sex, age, disability, veteran status, sexual orientation, gender identity, or any other classification protected by law.