Job Description
Job Title: ETL Data Engineer (ONSITE)
Location: Washington, DC
Duration: 12 Months+
Job Description: The Enterprise Data team at the client requires an ETL data engineer to support data operations for its Cloud Data Exchange. The resource will utilize native Azure tools to perform ETL, data loading, and data transformation tasks.
The ETL data engineer will support the client's Enterprise Data team with data curation, processing, and transformation tasks. Specifically, the ETL data engineer will be responsible for the following tasks:
Responsibilities: - Analyzes, designs, develops, implements, replicates, and supports complex enterprise data projects.
- Interfaces with other agencies; consults with and informs user departments on system requirements; advises on environment constraints and operating difficulties in the current state; resolves problems using cloud solutions; and develops and replicates future enhancements to the District’s data systems.
- Applies strong knowledge of Extract, Transform, and Load (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica; gathers requirements from stakeholders or analyzes existing code, and performs enhancements or new development.
- Establishes cloud and on-premises connectivity across systems such as ADLS, ADF, Synapse, and Databricks.
- Brings hands-on experience with Azure cloud services such as Azure Data Factory, Azure Synapse, MS SQL DB, Azure SQL DB, Azure Data Lake Storage Gen2, and Blob Storage, as well as Python.
- Creates end-to-end pipelines that read data from multiple sources or source systems and load it into a landing layer or SQL tables.
- Familiarity/experience with data integration and data pipeline tools (e.g., Informatica, Synapse, Apache NiFi, Apache Airflow)
- Familiarity/experience with various data formats, including database-specific (Oracle, SQL Server, DB2, Quickbase), text formats (CSV, XML), and binary formats (Parquet, Avro)
- Develops, standardizes, and optimizes existing data workflows/pipelines, adhering to best practices.
- Adheres and contributes to enterprise data governance standards by ensuring data accuracy, consistency, security, and reliability.
- Automates, monitors, alerts, and manages data pipelines and workflows.
- Analyzes and evaluates system changes to determine feasibility, provides alternative solutions, back-up, and rollback procedures.
- Works on the development of new systems and on upgrades and enhancements to existing systems, and ensures systems follow approved standards and remain consistent after changes.
- Develops complex programs and reports in a database query language.
- Familiarity/experience with data visualization tools.
- Familiarity/experience handling and securing sensitive data based on its level of sensitivity
Responsibilities: - Demonstrates expertise in conveying technical and functional concepts for a specific technical specialty.
- Identifies improvements to project standards to achieve high-quality services/products. This is a professional position which may require subject matter expertise consistent with demanding and rare technological skills.
- May require coordination of programming activities being conducted by the application development team
- Confers with other business and technical personnel to resolve problems of intent, inaccuracy, or feasibility of computer processing and project design.
- Works with interested personnel to determine whether modifications or enhancements are necessary.
- Leverages excellent written and verbal communication skills to develop new business process and programming solutions as directed by business and technical stakeholders.
- May coordinate activities of application developers.
- Able to identify best practices and standards for the use of the product.
- Proven track record of hands-on technical design and code work within large complex systems.
- Proven hands-on technical work with a variety of technologies.
- Demonstrated technical expertise integrating a variety of diverse technical environments and cross-platform technologies.
- Delivers support and design for industry specific applications that require integration with statewide systems or applications.
- Interacts with executive level business users or technical experts.
- Advanced experience in the required technical subject matter.
- May function as a niche technical SME (Subject Matter Expert).
- Has proven experience across large and complex implementations and systems.
Minimum Education/Certification Requirements: - Bachelor’s degree in Information Technology or a related field, or equivalent experience
Skills (Required / Desired; Amount of Experience):
- Strong knowledge for development of Extract-Transform-Load (ETL) processes, including end-to-end pipelines with data loading from multiple sources (Required, 15 years)
- Ability to gather and document requirements for data extraction, transformation and load processes (Required, 15 years)
- Understanding of data warehousing, data lake, business intelligence and information management concepts and standards (Required, 15 years)
- Ability to advise internal and external customers on appropriate tools and systems for complex data processing challenges (Required, 15 years)
- Knowledge and use of SQL for relational databases (Required, 11 years)
- Experience with various data formats including database-specific (Oracle, SQL, Postgres, DB2), text formats (CSV, XML) and binary formats (Parquet, Avro) (Required, 11 years)
- Contribute to enterprise data governance standards by ensuring accuracy, consistency, security and reliability (Required, 7 years)
- Strong experience with Microsoft Azure tools such as Azure Data Factory, Synapse, SQL Database, Data Lake Storage Gen2, and Blob Storage (Required, 5 years)
- Experience with data integration and data pipeline tools such as Informatica PowerCenter, Apache NiFi, Apache Airflow and FME (Required, 5 years)
- Experience with visualization and reporting software such as MicroStrategy, Tableau, and Esri ArcGIS (Highly desired, 3 years)
- Strong communication skills, both oral and written (Required, 3 years)
- Ability to provide excellent customer service to external partners (Required, 3 years)
- Ability to work independently or as part of a larger team (Required, 3 years)
- Experience performing data functions with Databricks (Highly desired, 3 years)