

Lead Data Transformation Developer

Employer
Octo Consulting Group
Location
Chantilly, VA
Closing date
Jan 26, 2022


Octo is an industry-leading, award-winning provider of digital services for the federal government. Octo specializes in agile software engineering, user experience design, cloud services, and digital strategy services that address the government's most pressing missions. Octo delivers intelligent solutions and rapid results, yielding lower costs and measurable outcomes. Our team is what makes Octo great. At Octo you'll work beside some of the smartest and most accomplished staff you'll find in your career. Octo offers fantastic benefits and an amazing workplace culture where you will feel valued while you perform mission-critical work for our government. Voted one of the region's best places to work multiple times, Octo is an employer of choice!

Job Description

You

As a Lead Engineer, you will join the team that is deploying and delivering a cloud-based, multi-domain Common Data Fabric (CDF), which provides data sharing services to the entire DoD Intelligence Community (IC). The CDF connects all IC data providers and consumers. It uses fully automated, policy-based access controls to create a machine-to-machine data brokerage service, enabling the transition away from legacy point-to-point solutions across the IC enterprise.

Us

We were founded as a fresh alternative in the government consulting community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and act as true partners in advancing our clients' missions.

Program Mission

The CDF program is an evolution in the way DoD programs, services, and combat support agencies access data, providing data consumers (e.g., systems, app developers, etc.) with a "one-stop shop" for obtaining ISR data.
The CDF significantly increases the DI2E's ability to meet the ISR needs of joint and combined task force commanders by providing enterprise data at scale. The CDF serves as a scalable, modular, open architecture that enables interoperability for the collection, processing, exploitation, dissemination, and archiving of all forms and formats of intelligence data. Through the CDF, programs can easily share data and access new sources using their existing architecture. The CDF is a network- and end-user-agnostic capability that enables enterprise intelligence data sharing from sensor tasking to product dissemination.

Skills & Requirements

Responsibilities

Managing and leading a diverse technical team that develops, deploys, and configures an enterprise streaming data transformation capability. This position enables the sharing of legacy and modern IC data resources across the Common Data Fabric to consumers of intelligence information across the enterprise.

In this role you will:

Lead the design and implementation of a large-scale streaming data ingest and transformation system supporting interoperability with legacy IC systems
Manage the conversion of native data to community-defined formats for consumption via legacy interfaces
Conduct research and development against legacy baselines to test and evaluate interoperability
Support the security accreditation and fielding of the system to customer environments
Document SOPs related to system deployment, configurations, transforms/crosswalks, and monitoring
Support the preparation of all required customer deliverables as specified and scheduled in the statement of work

What we'd like to see

Understanding of foundational ETL concepts and COTS/FOSS ETL tools
Experience working with streaming data tools such as Talend, StreamSets, AWS Data Pipeline, Apache NiFi, and Apache Kafka
Experience developing with Solr and Java
Demonstrable CentOS command-line knowledge
Working knowledge of RESTful APIs
A minimum of 5 years of experience managing IT teams
A minimum of 3 years of experience with programming and software development, including analysis, design, development, implementation, testing, maintenance, quality assurance, troubleshooting, and/or upgrading of software systems
Experience leveraging best practices, such as unit test integration, to ensure high code quality standards are met
Working knowledge of agile lifecycle management tools such as JIRA and Confluence
Experience working with source code repositories/Git
Experience with most phases of application development activities with minimal supervisory guidance
Experience with unit and integration testing of the developed software
Ability to coordinate and perform requirements and implementation analysis, design, development, testing, and enhancement of application software
Ability to obtain DoD 8570 IAT Level II certification

Desired Skills:

Working understanding of HBase, WebHDFS, Kafka, and NiFi
Previous experience developing an enterprise streaming data application
Advanced organizational skills with the ability to handle multiple assignments
Strong written and oral communication skills

Years of Experience: 10 years of experience or more
Education: Bachelor's degree in Engineering, Computer Science, or another related analytical, scientific, or technical discipline
Location: Chantilly, VA
Clearance: Active TS/SCI with the ability to obtain a CI Poly

Please note: You must be able to comply with Executive Order No. 13991 and Executive Order No. 14042. Octo is an Equal Opportunity/Affirmative Action employer.
All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. The selected applicant will be subject to a background investigation.
