Database Solution Architect

6 Months ago • 10+ Years • ₹30 LPA - ₹37 LPA

Job Description

CloudHire seeks a Database Solution Architect to build and maintain data architecture, data models, and conceptual data designs. Responsibilities include integrating multiple databases (snowflake schema, star schema, network model), working with message buses (Kafka, IBM MQ), and designing for scale. The role requires collaborating with stakeholders, defining future-state capabilities, and ensuring implementations meet platform standards. The architect will analyze the current technology environment, design data services and pipelines, and develop CI/CD processes with automated testing. Experience with AWS, GCP, or Azure, along with various databases (Postgres, Redshift, BigQuery, etc.) and object-oriented languages (.Net, Java, Python), is essential.
Must Have:
  • 7+ years AWS (preferred), GCP, or Azure
  • 10+ years experience designing data pipelines
  • 10+ years experience with object-oriented languages
  • 10+ years experience with relational and dimensional data models
  • Experience with automated testing for data pipelines
Perks:
  • Tuition reimbursement
  • Mentorship programs
  • Online course subscriptions
  • Paid industry certifications

Description

CloudHire, a remote employee provider that sources talent globally, is seeking a talented Data Architect who can build and maintain data architecture, data models, and conceptual data designs. In this position, you will work closely with our clients and remote teams around the world to ensure we provide the best service. We are a fully remote organization, and our people enjoy the benefits of a flexible work-life balance.

Responsibilities

  • Integrate multiple databases, including snowflake schema, star schema, and network model designs.
  • Work with multiple message buses (Kafka, IBM MQ) feeding targets such as Redshift, Postgres, and MongoDB.
  • Discover appropriate workloads and use the appropriate database to deliver the performance and functionality needed.
  • Design and deploy for scale, considering the types of requests the database must serve.
  • Plan database recovery with sequence and time constraints.
  • Collaborate directly with business and technology stakeholders to define future-state business capabilities and requirements, and translate those into transitional and target-state data architectures.
  • Partner with platform architects to ensure implementations meet published platform principles, guidelines, and standards.
  • Analyze the current technology environment to detect critical deficiencies and recommend solutions for improvement.
  • Design, implement, and maintain data services, interfaces, and real-time data pipelines through the practical application of existing, new, and emerging technologies and data engineering techniques.
  • Develop continuous integration and continuous deployment for data pipelines, including automated unit and integration testing.
  • Use workflow management platforms such as Airflow.
  • Mentor, motivate, and support the team to achieve organizational objectives and goals.
  • Advocate for agile practices to increase delivery throughput.
  • Create, maintain, and ensure consistency with published development standards.

Requirements

  • 7+ years with AWS (preferred), GCP or Azure
  • 10+ years of experience using standard methodologies to design, build, and support near real-time data pipelines and analytical solutions using Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS
  • 10+ years of experience using object-oriented languages (.Net, Java, Python) to deliver data for near real-time, streaming analytics.
  • 10+ years of experience working with partners documenting business requirements and translating those requirements into relational, non-relational, and dimensional data models using Erwin
  • 10+ years of experience working on agile teams delivering data solutions
  • 10+ years of experience developing MDM solutions
  • 8+ years of experience in delivering solutions on public cloud platforms (Google Cloud preferred)
  • Experience writing automated unit, integration, and acceptance tests for data interfaces & data pipelines
  • Ability to quickly comprehend the functions and capabilities of new technologies, and identify the most appropriate use for them
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation

Location : 100% Remote
Shift Timing : 3:30 PM to 11:30 PM
Salary : ₹30 LPA to ₹37 LPA (fixed)
Looking for Immediate Joiners

Benefits

CloudHire’s mission is to create a positive and lasting impact in the world. By increasing competitiveness and efficiency in businesses, we believe that we are a catalyst for innovation across a multitude of specialties.

CloudHire works closely with its team to help you grow as much as we grow, and value your holistic development. Employees who are most successful at CloudHire take initiative, know how to identify problems and provide solutions, and always put the Team first.

With a belief that the journey to growth and greatness is ongoing, CloudHire gives employees the opportunity to continue learning and honing their skills with programs such as tuition reimbursement; mentorship programs; lunch and learns; online course subscriptions; paid industry certifications; business resource groups; and more.
