Primary Responsibilities:
• Design and develop Extract, Transform, and Load (ETL) processes to support MI/BI efforts, applying strong knowledge of data warehousing concepts.
• Develop advanced, complex ETL pipelines and custom shell/Python scripts.
• Design, develop, and maintain Tableau dashboards and data visualizations to support business intelligence and management reporting needs; connect Tableau to enterprise data sources including Greenplum/PostgreSQL, DB2, SQL Server, and Oracle.
• Perform data mapping and source-to-target analysis across multiple data sources, including DB2, Oracle, Greenplum, and flat files.
• Read, debug, modify, and create complex SQL code in a data warehouse environment.
• Load and integrate data from multiple source systems into the Greenplum MPP database, applying expert knowledge of data distribution, table partitioning, and compression strategies.
• Design, develop, test, and optimize ETL solutions for processing large volumes of data; implement efficient migration processes to move ETL objects across development, test, and production environments.
• Translate business requirements into technical specifications; collaborate with Business Analysts, Data Architects, Enterprise Architects, and ETL Architects to ensure solutions meet requirements.
• Identify and promote reuse of ETL components and services to avoid duplicative implementations and reduce technical debt.
• Work with configuration management to maintain software versions using Bitbucket/Git; prepare and deploy software packages per SSA change management standards.
• Prepare and maintain comprehensive technical documentation; deliver knowledge transfer sessions to SSA staff as needed.
• Ensure compliance with SSA security, privacy, and data management policies throughout all development activities.
• Mentor junior team members in developing, documenting, and modifying ETL components and in applying best practices.
• Participate in all design reviews, requirement sessions, and team problem-solving efforts; communicate technical concepts effectively to both technical and non-technical stakeholders.
Minimum Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field, with 10+ years of relevant ETL development experience. Additional years of relevant experience may be accepted in lieu of a degree.
• Linux/UNIX proficiency, including Bash shell scripting and Linux OS administration in a database and ETL context.
• Hands-on experience with ETL tools such as Informatica, Ab Initio, and Syncsort/Precisely; proficiency with PSQL, file transfer protocols (FTP/SFTP), and Bitbucket for version control.
• Experience scripting in a Linux environment to automate ETL solutions and data migration jobs.
• Strong SQL proficiency across multiple platforms; experience with PL/pgSQL for PostgreSQL and Greenplum environments and PL/SQL for Oracle environments; ability to select the appropriate approach based on platform and use case.
• Extensive experience with SQL mapping and integration across DB2, Oracle, and Greenplum database platforms.
• Strong experience extracting data from multiple source systems, including DB2, Oracle, PostgreSQL, Greenplum, and flat files.
• Tableau dashboard development experience, including connecting to enterprise data sources, building calculated fields and parameters, designing interactive dashboards and scorecards, and publishing workbooks to Tableau Server for BI/MI reporting.
• Full comprehension of SQL, cardinality, levels of granularity, normalized vs. denormalized data models, and data architecture best practices.
• Hands-on experience designing and developing ETL solutions involving large volumes of data; experience with large-scale, complex data migration efforts.
• Strong understanding of data recovery and job rerun procedures in an ETL environment.
• Working knowledge of developing optimized code that maintains high performance when processing large volumes of data.
• Strong attention to detail with excellent analytical, organizational, and troubleshooting skills.
• Excellent written and verbal communication skills.
• Ability to obtain and maintain a Public Trust clearance (U.S. citizenship or lawful permanent residence required).

Desired Qualifications:
• Active Public Trust or higher-level security clearance.
• Prior experience supporting federal government clients, particularly SSA or other large civilian agencies.
• Strong database skills.
• Experience with Python scripting for ETL automation, data pipeline development, and operational tasks.
• Familiarity with Agile/Scrum methodologies and DevSecOps practices in a federal IT environment.
• Experience with Jira and Confluence.
• Knowledge of FISMA compliance requirements and experience working within an ATO (Authority to Operate) framework.
• Tableau Desktop Specialist or Tableau Certified Data Analyst certification.
• Informatica AXON or Ab Initio professional certification.