About
Job Title: Data Architect (10+ Years Experience)
Job Type: Remote
Employment: Full Time / W2 (No C2C)
Job Description:
We are seeking a highly experienced Data Architect with 10+ years of expertise in designing, implementing, and optimizing large-scale data architectures across cloud, on-premises, and hybrid environments. The ideal candidate will have deep hands-on experience with data modeling, data warehouse/lakehouse architecture, ETL/ELT design, data governance, MDM, metadata management, big data platforms, and cloud-native data solutions (Azure/AWS/GCP). This role requires strong analytical thinking, an enterprise architecture mindset, and the ability to collaborate with cross-functional engineering, analytics, and business teams.
Responsibilities
- Architect, design, and implement enterprise-scale data platforms, including data lakes, data warehouses, lakehouse architectures, and real-time data systems.
- Define and govern data modeling standards using conceptual, logical, and physical models across OLTP, OLAP, and cloud-native systems.
- Design ETL/ELT frameworks and scalable ingestion pipelines using tools such as Azure Data Factory, AWS Glue, dbt, Informatica, Talend, SSIS, and Databricks.
- Lead architecture for cloud data ecosystems using Azure (ADF, ADLS, Synapse, Databricks, Fabric), AWS (S3, Redshift, Glue, EMR), or GCP (BigQuery, Dataflow, GCS).
- Define and implement data governance, data quality, data lineage, metadata management, and MDM frameworks.
- Collaborate with engineering teams to implement data catalogs, business glossaries, and standardized documentation.
- Architect and support real-time and streaming data pipelines using Kafka, Event Hubs, Kinesis, Pub/Sub, or equivalent technologies.
- Evaluate and select RDBMS, NoSQL, distributed storage, and analytics platforms (SQL Server, Oracle, PostgreSQL, MongoDB, Cassandra, Snowflake, BigQuery, Synapse).
- Develop reference architectures, blueprint designs, best practices, API integration patterns, and data integration strategies.
- Ensure data platforms meet enterprise security, compliance, encryption, data masking, and access control standards.
- Work closely with Data Engineers, BI Developers, Analysts, Product Teams, and Business Stakeholders to translate business needs into scalable solutions.
- Lead performance tuning, optimization, and capacity planning for large-scale data environments.
- Provide architectural leadership and guidance during project planning, sprint planning, code reviews, and solution delivery.
- Stay current with emerging data technologies and recommend adoption strategies aligned with enterprise goals.
Required Skills:
- Strong hands-on expertise in data modeling (3NF, Star/Snowflake schemas, Data Vault, Dimensional Modeling).
- Experience architecting systems using Azure, AWS, or GCP, including:
  - Azure: ADLS, ADF, Synapse Analytics, Databricks, Microsoft Fabric
  - AWS: Redshift, Glue, EMR, S3, Athena
  - GCP: BigQuery, Dataflow, Pub/Sub
- Strong expertise in SQL, Python, PySpark, Spark SQL, and distributed compute frameworks.
- Deep understanding of ETL/ELT methodologies, orchestration, and pipeline design.
- Knowledge of big data ecosystems: Hadoop, Hive, Spark, Delta Lake, Hudi, Iceberg.
- Experience with modern data warehouse/lakehouse architectures: Snowflake, Synapse, Databricks Lakehouse.
- Strong experience in data governance, MDM, metadata management, and data quality frameworks.
- Solid understanding of REST APIs, microservices, and integration architecture.
- Proficiency with CI/CD pipelines, version control (Git), DevOps tools, and automated deployment strategies.
- Strong understanding of security best practices including IAM, RBAC, encryption, data masking, and compliance (HIPAA, GDPR, PCI).
- Excellent communication, leadership, and architectural documentation skills.
Preferred Skills:
- Certifications such as:
- Azure Data Engineer / Azure Solutions Architect
- AWS Data Analytics Specialty / AWS Solutions Architect
- Google Professional Data Engineer
- Databricks Lakehouse Architect
- Experience with Master Data Management (MDM) platforms (Informatica MDM, Reltio, SAP MDG).
- Knowledge of BI & Analytics tools (Power BI, Tableau, Looker, Qlik).
- Experience with AI/ML platform integration (MLflow, SageMaker, Vertex AI).
Language Skills
- English
This job listing comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.