Overview:
LMI is seeking a skilled Data Engineer. Successful Data Engineers demonstrate competency in data pipelining, data analysis, statistics, programming, project execution, and critical thinking.
LMI is a consultancy dedicated to powering a future-ready, high-performing government, drawing from expertise in digital and analytic solutions, logistics, and management advisory services. We deliver integrated capabilities that incorporate emerging technologies and are tailored to customers’ unique mission needs, backed by objective research and data analysis. Founded in 1961 to help the Department of Defense resolve complex logistics management challenges, LMI continues to enable growth and transformation, enhance operational readiness and resiliency, and ensure mission success for federal civilian and defense agencies. We believe government can make a difference, and we seek talented, hardworking people who share that conviction.
Responsibilities:
The Data Engineer will join a team supporting the Defense Logistics Agency (DLA). In this role, the Data Engineer will:
- Frame and scale data problems, then analyze and visualize data to develop solutions.
- Manipulate common data formats, including comma-delimited (CSV), plain-text, and JSON files.
- Derive insights and analytic narratives from data and visualizations to tell a clear, compelling story in response to research questions.
- Build robust, scalable data pipelines using technologies such as StreamSets, Snowflake, and Immuta (see the illustrative sketch after this list).
- Apply critical and analytical thinking skills to translate complex information into understandable and impactful work products.
- Oversee and complete special projects as needed.
- Rapidly prioritize competing requirements, and understand and simplify client needs.
- Communicate with clients through written reports and oral presentations.
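As a purely illustrative sketch of the pipeline and data-format work described above (not a description of LMI's or DLA's actual systems), the PySpark snippet below reads comma-delimited and JSON inputs, applies a simple transformation, and writes a curated output; the file paths, column names, and output format are hypothetical.

```python
# Minimal, illustrative ETL step in PySpark; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Ingest common formats: comma-delimited and JSON.
orders = spark.read.option("header", True).option("inferSchema", True).csv("data/orders.csv")
items = spark.read.json("data/items.json")

# Simple transformation: join, derive a field, and filter.
enriched = (
    orders.join(items, on="item_id", how="left")
          .withColumn("extended_cost", F.col("quantity") * F.col("unit_cost"))
          .filter(F.col("quantity") > 0)
)

# Persist the curated output for downstream analysis or visualization.
enriched.write.mode("overwrite").parquet("output/enriched_orders")

spark.stop()
```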
Qualifications:
Required:
- Bachelor’s degree in data science, mathematics, statistics, economics, computer science, engineering, or a related business or quantitative discipline.
- Experience leading and/or supporting data operations teams in developing data architecture, policies, extract-transform-load (ETL) pipelines, and data models.
- Experience with object-oriented programming languages (Python, Java), database and ETL tools (StreamSets, Snowflake, Immuta, PySpark), and associated data science libraries (scikit-learn).
- Knowledge of data science methods related to data architecture, data munging, data and feature engineering, and predictive analytics.
- Working knowledge of databases and SQL; experience connecting analytic and data visualization products directly to database connections is preferred (see the illustrative sketch after this list).
- Superior communication skills, both oral and written.
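To illustrate the kind of database-backed analysis referenced above (a generic sketch, not LMI's or DLA's tooling), the snippet below opens a database connection, pulls a SQL query result into a pandas DataFrame, and hands it to a plotting library; the database file, table, and column names are hypothetical, and SQLite stands in for an enterprise database only because it requires no server.

```python
# Illustrative sketch: feed an analytic/visualization product from a SQL query.
# The database file, table, and column names are hypothetical.
import sqlite3

import matplotlib.pyplot as plt
import pandas as pd

# Open a database connection (SQLite used here for a self-contained example).
conn = sqlite3.connect("warehouse.db")

# Pull an aggregated result set directly into a DataFrame.
query = """
    SELECT region, SUM(quantity) AS total_quantity
    FROM shipments
    GROUP BY region
    ORDER BY total_quantity DESC
"""
df = pd.read_sql_query(query, conn)
conn.close()

# Hand the result to a visualization layer.
df.plot(kind="bar", x="region", y="total_quantity", legend=False)
plt.ylabel("Total quantity shipped")
plt.tight_layout()
plt.savefig("shipments_by_region.png")
```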
Desired:
- Previous experience working with the DoD or with supply chain data processes.
- Experience with unstructured text and natural language processing.
- Experience developing data mining, statistical, network, natural language processing, text analytics, and graph-based algorithms to analyze massive data sets.
- Experience supervising algorithm implementation in on-premises and cloud-based computing environments.