Join a team of passionate technologists driving innovation in AI/ML, data engineering, cloud, and cybersecurity — with meaningful work on federal and commercial missions that matter.
We invest in our people because our people deliver the mission. Here's what you'll find when you join IBIS.
Market-leading salaries, performance bonuses, and equity participation for senior roles — reviewed annually.
Comprehensive medical, dental, and vision coverage for you and your family, including FSA and HSA options.
Annual training budget, paid certification exams (AWS, Azure, CISSP, and more), and LinkedIn Learning access.
Hybrid and remote-friendly roles with flexible hours — we focus on outcomes, not clock-watching.
Structured career ladders, mentorship programs, and real advancement opportunities based on merit.
We actively sponsor security clearances — a powerful career asset for federal technology professionals.
Primary Responsibilities:
• Administer, configure, monitor, and maintain PostgreSQL Community Edition, Greenplum, and IBM DB2 databases across development, QA, staging, and production environments.
• Design, implement, and manage database architecture solutions, including schema design, data modeling, clustering, replication, and partitioning strategies across all supported platforms.
• Perform performance tuning and query optimization for PostgreSQL, Greenplum (MPP query optimization, distribution keys, resource queues), and DB2 workloads.
• Develop and maintain backup, recovery, and disaster recovery procedures for all database platforms; conduct periodic DR testing to validate recoverability.
• Configure and manage high availability (HA) and failover solutions (e.g., Patroni/streaming replication for PostgreSQL, Greenplum segment mirroring, DB2 HADR).
• Troubleshoot and resolve database issues including integrity problems, performance bottlenecks, blocking/deadlocking, replication failures, connectivity issues, and security vulnerabilities.
• Create, test, and deploy automation scripts (Python, Bash, or Shell) to streamline database operations, patching, monitoring, and routine maintenance tasks.
• Establish and manage database user accounts, roles, and permissions; enforce least-privilege security controls in compliance with SSA and federal security standards (FISMA, NIST 800-53).
• Set up alerting and monitoring frameworks (e.g., Nagios, Prometheus, Grafana, native DB monitoring) to ensure system health, availability, and proactive incident response.
• Conduct capacity monitoring and short- and long-term capacity planning in collaboration with application developers, system architects, and infrastructure teams.
• Develop and maintain technical documentation for database build, configuration, patching, upgrade, and operational procedures.
• Collaborate with application development, architecture, and DevOps/release management teams to provide DBA guidance for new deployments, migrations, and schema changes.
• Participate in technical reviews and walkthroughs of system design documentation; provide architectural recommendations aligned with SSA standards.
• Support database migrations and platform upgrades, including PostgreSQL version upgrades, Greenplum cluster expansions, and DB2 maintenance releases.
• Provide on-call support and off-hours maintenance windows as scheduled; apply production packages during non-peak hours.
• Adhere to SSA’s change management, incident management, and SDLC processes and procedures.
Primary Responsibilities:
• Design and develop Extract, Transform, and Load (ETL) processes to support MI/BI efforts, applying strong knowledge of data warehousing concepts.
• Develop advanced and complex ETL pipelines and custom shell/Python scripting.
• Design, develop, and maintain Tableau dashboards and data visualizations to support business intelligence and management reporting needs; connect Tableau to enterprise data sources including Greenplum/PostgreSQL, DB2, SQL Server, and Oracle.
• Perform data mapping and source-to-target analysis across multiple data sources including DB2, Oracle, Greenplum, and flat files.
• Read, debug, modify, and create complex SQL code associated with a data warehouse environment.
• Load and integrate data from multiple source systems into the Greenplum MPP database; apply expert knowledge of data distribution, table partitioning, and compression strategies.
• Design, develop, test, and optimize ETL solutions for processing large volumes of data; implement efficient migration processes to move ETL objects across development, test, and production environments.
• Translate business requirements into technical specifications; collaborate with Business Analysts, Data Architects, Enterprise Architects, and ETL Architects to ensure solutions meet requirements.
• Identify and promote reuse of ETL components and services to avoid duplicative implementations and reduce technical debt.
• Work with configuration management to maintain software versions using Bitbucket/Git; prepare and deploy software packages per SSA change management standards.
• Prepare and maintain comprehensive technical documentation; deliver knowledge transfer sessions to SSA staff as needed.
• Ensure compliance with SSA security, privacy, and data management policies throughout all development activities.
• Mentor junior team members in developing, documenting, and modifying ETL components and best practices.
• Participate in all design reviews, requirement sessions, and team problem-solving efforts; communicate technical concepts effectively to both technical and non-technical stakeholders.
Position Description:
· Design and develop APIs using tools such as Postman, OpenAPI/Swagger, and SOAP; work with supporting platforms including Prometheus, Grafana, and OpenShift.
· Apply expertise in Java software development.
· Collaborate with product design and engineering teams to develop an understanding of business needs.
· Participate in all Agile ceremonies.
A Test Automation Engineer designs, develops, and maintains automated test scripts and frameworks to validate software functionality, performance, and reliability. They work within Agile teams to create test plans, execute automated tests, identify bugs, and integrate tests into CI/CD pipelines to ensure high-quality software releases.
Key Responsibilities
· Framework Development: Build, customize, and maintain automated testing frameworks and tools.
· Script Creation: Write, execute, and debug automated test scripts for web, API, or mobile applications.
· Test Strategy: Review requirements to create comprehensive test plans and define the scope of automation.
· Defect Management: Identify, document, and track bugs, ensuring resolution through regression testing.
· CI/CD Integration: Integrate test suites into CI/CD pipelines (e.g., Jenkins) for continuous testing; extensive experience with Git, Bitbucket, Jira, and Confluence.
· Collaboration: Work with developers and product managers to ensure comprehensive test coverage.
Position Description:
· Design and develop applications using Angular, Java, and Spring Boot.
· Mentor junior staff.
· Collaborate with product design and engineering teams to develop an understanding of needs.
· Participate in all Agile ceremonies.
Position Description:
· Develop models using Python, SAS, or R.
· Collaborate closely with Product Owners and other SMEs.
· Present findings to senior staff and make recommendations for improvements.
· Mentor junior staff.
· Participate in Agile ceremonies, as required.
Position Description:
· Design Confluent Kafka cluster environments, configure and manage Kafka instances, and monitor system performance.
· Ensure data integrity and availability in a big data environment.
· Apply expertise in a programming language such as Java or Python.
· Collaborate with product design teams and SMEs to understand data pipeline needs.
· Participate in all Agile ceremonies.
We are seeking a Senior Data Analyst to join our growing analytics team. You will design and implement data solutions for federal clients, develop dashboards, and translate complex data into actionable insights for stakeholders.
We're always interested in meeting talented people. Send us your resume and tell us what you do best — we'll keep you in mind for upcoming roles that match your skills.