
Data Engineer - Databricks & DevOps

Posted: Sunday, June 02, 2024 04:45 AM

*Data Engineer - Databricks & DevOps*
*Location:* Chicago, IL (Hybrid), W2

*Top Skills - Must Haves*
• Azure DevOps
• Databricks
• Snowflake
• Azure Databricks
• Synapse
• Python
• Snowpark (Snowflake's Snowpark delivers the benefits of Spark ETL)
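As an illustration of that last point, here is a minimal Snowpark sketch, assuming hypothetical connection parameters and table names (RAW_ORDERS, DAILY_SHIPPED_TOTALS); it shows a Spark-style filter-and-aggregate step expressed with the DataFrame API and executed inside Snowflake:

```python
# Minimal Snowpark sketch: a Spark-style DataFrame pipeline that runs inside
# Snowflake. Connection parameters and table names below are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "<account>",      # placeholder credentials
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Lazy DataFrame; the filter/aggregate below is compiled to SQL and pushed
# down to Snowflake compute rather than running on a Spark cluster.
orders = session.table("RAW_ORDERS")
daily_totals = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("ORDER_DATE")
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)

# Persist the result, analogous to a Spark table write.
daily_totals.write.save_as_table("DAILY_SHIPPED_TOTALS", mode="overwrite")
```

The point of the pattern is that familiar Spark-style ETL transformations are translated to SQL and run on Snowflake's own compute rather than on a separate Spark cluster.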
*Top Skills' Details*
• At least 10 years of experience developing and deploying large-scale data solutions in an enterprise and/or cloud environment.
• A minimum of 2 years of hands-on staging and production experience in each of the following areas:
  • Microsoft Azure Databricks
  • ADLS
  • Data ingestion and ETL (e.g., ADF/Azure Synapse)
  • Snowflake
  • Documentation and troubleshooting
• In-depth operational expertise in the following areas:
  • Cloud architecture best practices around operational excellence, security, reliability, performance efficiency, and cost optimization (e.g., the Cloud Well-Architected Framework)
  • Best practices and IT operations for dynamic, always-up, always-available services
• Bachelor's degree in Computer Science or a related discipline.
*Job Description*
The ideal candidate will be customer-centric and deeply passionate about using modern cloud architectures and technology to drive substantial business impact across organizational boundaries. The DevOps Engineer will support the Analytics Workbench (AWB), specifically Snowflake and Databricks, for the Business Analytics Service (BAS) team and other analytics teams as they are onboarded. Responsibilities encompass the administration, maintenance, and provisioning of the infrastructure elements of the Azure data mesh for BAS. A key focus of this role is Infrastructure as Code (IaC): the candidate will implement technical solutions to automate the provisioning of compute cluster capacity, manage compute and storage resources, monitor resource utilization, and assist in migrating production interactive notebooks to job-based compute.
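As a minimal sketch of that IaC/automation focus, assuming the Databricks SDK for Python with workspace credentials supplied via environment variables (the cluster name, runtime version, and node type below are hypothetical examples, and the team's actual tooling may instead be Terraform or PowerShell):

```python
# Sketch: programmatic provisioning of an autoscaling Azure Databricks cluster
# with auto-termination, using the Databricks SDK for Python.
# Assumes DATABRICKS_HOST / DATABRICKS_TOKEN (or equivalent auth) are set.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import AutoScale

w = WorkspaceClient()

cluster = w.clusters.create(
    cluster_name="awb-batch-example",    # hypothetical cluster name
    spark_version="14.3.x-scala2.12",    # example LTS runtime
    node_type_id="Standard_DS3_v2",      # example Azure VM size
    autoscale=AutoScale(min_workers=2, max_workers=8),
    autotermination_minutes=30,          # release idle compute to control cost
).result()                               # block until the cluster is running

print(f"Cluster {cluster.cluster_id} is in state {cluster.state}")
```

In practice, a call like this would typically live in version-controlled automation (for example a Terraform databricks_cluster resource or a PowerShell script) and run from an Azure DevOps pipeline rather than being executed by hand.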
*Responsibilities:*
• Maintain and evolve AWB cloud-based platform capabilities (including the ETL platform that will be selected to ingest data).
• Collaborate with multiple internal teams (e.g., Cloud Foundation, InfoSec, Network teams) to manage and maintain the Analytics Workbench (e.g., billing, subscription levels, workspaces, Azure account hosting, audit logs, and high-level usage monitoring of platform users).
• Research and prototype solutions for data connectivity and operations across various Microsoft Azure platforms; create and maintain materials covering data ingestion methods, troubleshooting, and related queries.
• Implement Databricks delegated account administration across business divisions and functions, ensuring effective management and support.
• Act as the key Databricks workspace administrator, facilitating discussions, addressing partner needs, and providing support and planning for new implementations.
• Develop and enhance the AWB platform in line with vendor release cycles and internal requirements (e.g., data protection standards). Regularly review systems and propose improvements.
• Collaborate proactively with software engineers and site reliability engineers to troubleshoot Databricks performance, connectivity, and security issues within the platform engineering organization and user community.
• Explore, adopt, and apply new technologies to resolve issues while adhering to company security protocols and standards.
*Preferred Specific Skills:*
• Knowledge of data analytics technology and methodology, such as advanced analytics or machine learning
• Experience with DevOps, DataOps, and/or MLOps using ADO
• Working knowledge of data security, such as SecuPi and Azure native capabilities
• Knowledge of IaC and automation, such as Terraform and PowerShell
• Working experience with Azure and AWS cloud data and service offerings

*Work Environment*
Onsite hybrid, 3 days a week.

*External Communities Job Description*
We have an exciting job opportunity for a DevOps Data Engineer with Databricks experience, working in a highly reputable, enterprise-level Financial Services environment.
*Placement Type and Rate/Labor Information*
Job Types: Full-time, Contract
Schedule: 8-hour shift
Experience:
• Azure: 5 years (Required)
• AWS: 5 years (Required)
• Kubernetes: 5 years (Required)
• DevOps: 4 years (Required)
• Databricks: 5 years (Required)
• Snowflake: 5 years (Required)
• Python: 5 years (Required)
• Spark: 5 years (Required)
• ETL: 5 years (Required)
• Snowpark: 2 years (Required)
Work Location: In person

• Phone: NA

• Location: 720 13th Street, North Chicago, IL

• Post ID: 9103015971

