ServiceNow is a platform-as-a-service that allows for the operation of enterprise and technical management support systems, such as IT service management and help desk functionality. The company's core business revolves around the management of "incident, problem, and change" IT operational events.
ServiceNow is seeking a Data Platform Architect for the Enterprise Data Team, based out of Hyderabad. This is a senior individual contributor role with hands-on involvement in solution design across the end-to-end data ecosystem. The Data Platform Architect will provide technical expertise to the Analytics organization on strategic programs as well as enterprise initiatives, lead the technology roadmap and technical evaluations, bring in architectural value additions and best practices, provide optimal design recommendations, and help build next-generation analytics.
Platform Design and Architecture:
Lead the design and development of the data platform architecture, ensuring scalability, performance, reliability, and security
Define and implement standards for data modeling, data integration, and data lifecycle management
Apply expertise in the modern data platform stack, with end-to-end coverage, to build large-scale data and AI solutions
Create blueprints for data pipelines, data lakes, data warehouses, and analytical systems
Provide technical leadership in choosing appropriate technologies for data processing, cloud compute, and storage solutions
Technical Solutions and Roadmap:
Influence enterprise architecture design conversations and deliver sophisticated data solutions
Work closely with leaders, data engineers, data scientists, and analysts to define and refine data platform requirements
Lead cross-functional teams to develop and integrate new data products and solutions
Understand business needs and translate them into data solutions and an architecture roadmap that add value to the organization
Cloud Usage and Governance:
Design and implement cloud-based solutions for data processing and storage (e.g., Azure, Snowflake, Databricks, GCP)
Optimize cloud resources for cost efficiency, performance, and availability
Ensure the security and compliance of data platforms, addressing regulatory and privacy concerns
Develop strategies to enforce data governance policies, ensuring data quality, consistency, and integrity across systems
Design data security measures and control access to sensitive data through role-based access and encryption
Innovation & Continuous Improvement:
Stay up-to-date with emerging technologies and trends in data architecture, big data, cloud computing, and AI
Recommend and lead initiatives to improve the performance, scalability, and efficiency of data processing and storage systems
Act as the Data Architecture subject matter expert to drive innovation for the company
Documentation and Technical Design:
Produce detailed documentation for platform architecture, data models, and data workflows
Be well versed in technical design, diagramming, and documentation tools
What you need to be successful in this role:
Experience in designing and implementing end-to-end data platforms, including data lakes, data warehouses, and data integration pipelines.
Experience designing and developing low-latency, high-throughput, enterprise-grade data architecture ecosystems
Knowledge of relational and non-relational databases, and big data technologies (e.g., Hadoop, Spark, Kafka).
Expertise in cloud and data platforms such as Azure, Snowflake, and Databricks, and in tooling such as GitHub and Jenkins
Strong knowledge of ETL processes and tools for real-time data processing
Proficiency in building data solutions using tools like Apache Kafka, Apache Airflow, dbt (data build tool), and Python
Strong understanding of SQL and data querying best practices
Proficiency in managing and deploying solutions on cloud platforms such as Azure, Snowflake, Databricks
Experience with data encryption, privacy, and security best practices, including GDPR compliance
Excellent problem-solving and communication skills
Strong scripting skills in Python, Shell, or similar languages for automation and process optimization
Familiarity with CI/CD pipelines, version control (Git), and deployment automation tools (Jenkins, Terraform)
Familiarity with BI tools such as Tableau, Power BI, or Looker, as well as experience working with data scientists and analysts to support analytical workloads