Data Analyst III

1 Day ago • 5+ Years • $45,777–$73,243 PA
Data Analysis

Job Description

This Data Analyst III role supports a mission-driven program focused on modernizing healthcare data management for federal employees. The team builds and maintains a secure, data-driven platform to streamline health benefits data collection, integration, and management. Responsibilities include automating data feedback loops, managing provider and pharmacy files, overseeing bi-directional file transfers, and ensuring data quality and security across internal and external systems. The role works with Azure, Databricks, Linux, and Ansible in an agile environment.
Perks:
  • Flexible, life-friendly schedules
  • 100% coverage for medical HSA plan + HSA contributions
  • Dental & vision covered 100% for you and your dependents
  • Competitive premiums for HMO/PPO and dependent coverage
  • 401(k) with 4% match & immediate vesting
  • Paid Parental Leave
  • 12 weeks paid FMLA
  • Generous PTO
  • 11 Federal Holidays
  • Birthday Holiday
  • Sick Leave
  • Up to 15 days for Jury Duty
  • Bereavement Leave
  • Education allowance
  • Wellness allowance
  • Tech allowance
  • Referral bonus: $6K–$12K for each successful referral
  • Pet insurance & discount plans
  • Employee Assistance Program (EAP)
  • Legal support
  • Life insurance
  • Disability coverage

Description

The Impact You Will Create:

This program is a mission-driven effort supporting our customer's employer-sponsored healthcare program. The team is building and maintaining a modern, secure, data-driven platform to streamline how our customer collects, integrates, and manages health benefits data for millions of federal employees, retirees, and their families. This work includes developing intuitive front-end features, integrating with complex backend systems and APIs, and automating key processes to improve efficiency and accuracy. Team members will collaborate in an agile environment, shaping a platform that is scalable, accessible, and critical to the federal benefits system.

Your Responsibilities in This Role:

  • Conduct and manage the automation of monthly data feedback loops for enrollment and provider files.
  • Manage incoming FEHB and PSHB monthly provider files, including performing initial validation checks, issuing feedback and reminder notifications for late submissions, and troubleshooting related issues.
  • Oversee and manage the bi-directional file transfer processes across multiple external systems, including USPS, SSA, RS, and NFC-CLER.
  • Manage the automation and monitoring of feedback loops for monthly provider file submissions.
  • Manage incoming pharmacy-related files (RXCU, RXMP, Formulary, FDB, Medispan, RXNORM) through SFTP file transfers and direct data pulls.
  • Perform ongoing data and file management, including monitoring for notifications, error outputs, and ensuring successful completion of file transfers and data exchanges across internal and external systems.
  • Manage Linux and Ansible environments to support file operations, automation processes, and data management tasks.
  • Oversee DevOps ticket submissions and ensure timely resolution of process-related issues.
  • Support data and file management processes with a specific focus on file encryption, decryption, and data security.
  • Manage, set up, and test SFTP file transfers from on-premises systems to Azure cloud environments, including IP porting to newly created Azure SFTP servers.
  • Collaborate directly with system administrators to troubleshoot and verify data quality, integrity, and security across all managed processes.
  • Submit ECM change requests to support modifications to data processing workflows.
  • Provide test, development, and production support for newly designed ROVR data functions and Azure-based data flows through Databricks.
  • Execute database script updates as needed for data processing changes and enhancements.
  • Manage and assist with Azure Data Lake storage setup, maintenance, and data hosting.
  • Support and assist with data storage and data transfer functions within Azure and Databricks environments.
  • Manage GitHub repositories as they relate to Azure Data Factory and Databricks data integration processes.
  • Conduct ongoing file and data validation activities, reviewing initial results and providing clarification to contractors regarding data requirements and architecture adjustments.
  • Assign testing and development tasks to contractors as needed and monitor completion.
  • Receive and process feedback from HSA regarding data results and communicate necessary updates or clarifications to contractors or carriers.

Work Experience

Skills and Qualifications We Require:

At Fearless, we seek candidates who blend technical know-how with sharp problem-solving and advisory skills to drive real impact in the communities we serve. Here are the key qualifications for this role.

  • Ability to obtain a public trust clearance.
  • Must have a Bachelor's degree.
  • Minimum of 5 years of relevant experience.
  • Demonstrated experience managing large-scale data and file processes involving multiple external and internal systems.
  • Hands-on experience with SFTP file transfers, including setup, configuration, and troubleshooting in cloud and on-prem environments.
  • Experience working with Azure cloud platforms, including Data Lake, Data Factory, and Databricks.
  • Proficiency in managing Linux environments and using Ansible for process automation.
  • Understanding of data encryption, decryption, and secure file handling protocols.
  • Experience monitoring and validating data quality, performing initial checks, and resolving data transfer or processing errors.
  • Familiarity with DevOps processes, including submitting and tracking tickets for data operations.
  • Ability to coordinate across multiple stakeholders, including system administrators, contractors, and external partners, to ensure timely and accurate data delivery.
  • Experience performing database script updates and supporting changes to data processing workflows.
  • Strong analytical and troubleshooting skills to identify and resolve data processing or integration issues.
  • Proven ability to manage multiple concurrent data operations, automate repeatable processes, and maintain documentation of methodologies and data flow procedures.
  • Excellent communication and collaboration skills, with the ability to translate technical findings into actionable information for contractors, administrators, and business partners.

Physical Requirements:

  • Ability to sit for extended periods while working on a computer or during meetings.
  • Must be able to travel occasionally to client sites or company meetings.
  • Ability to communicate effectively via phone, email, and in-person, requiring clear speech, listening, and written communication skills.
  • Ability to move within an office environment, including reaching for files, using office equipment, and occasional light lifting (up to 10 pounds).
