Remote Data Automation Engineer needs 5+ years of experience building data engineering pipelines on both on-premises and cloud platforms (Snowflake, Databricks).

Remote Data Automation Engineer requires:

• Strong experience coding in Python, PySpark, and SQL, and building automations.

• Knowledge of cybersecurity, IT infrastructure, and software concepts.

• Knowledge of IT Asset Management, ITIL, and ITSM practices is a plus.

• 3+ years of experience using data warehousing / data lake techniques in cloud environments.

• 3+ years of experience developing data visualizations using Tableau, Plotly, or Streamlit.

• Experience with ELT/ETL tools such as dbt, Airflow, Cribl, Glue, Fivetran, and Airbyte.

• Experience with capturing incremental data changes, streaming data ingestion, and stream processing.

• Experience with processes supporting data governance, data structures, and metadata management.

• Solid grasp of data and analytics concepts and methodologies, including data science, data engineering, and data storytelling.
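As one concrete illustration of the incremental-change-capture requirement above, a common pattern is a "high watermark" extract: each run pulls only rows updated since the previous run. The sketch below uses Python's built-in sqlite3 as a stand-in for a warehouse; the table and column names are hypothetical, not from any specific system.

```python
import sqlite3

def incremental_extract(conn, last_watermark):
    """Return rows changed since last_watermark, plus the new watermark.

    Hypothetical table `source_events(id, payload, updated_at)`; in a real
    pipeline the watermark would be persisted between runs.
    """
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM source_events "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    # Advance the watermark to the latest change we saw (or keep the old one).
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# Demo with an in-memory database standing in for the source system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE source_events (id INTEGER, payload TEXT, updated_at INTEGER)"
)
conn.executemany(
    "INSERT INTO source_events VALUES (?, ?, ?)",
    [(1, "a", 100), (2, "b", 200), (3, "c", 300)],
)

# A run that last stopped at watermark 100 picks up only rows 2 and 3.
rows, wm = incremental_extract(conn, last_watermark=100)
```

The same idea scales up in PySpark or dbt (dbt's incremental models apply an equivalent filter), but the watermark logic is the core of it.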

Remote Data Automation Engineer duties:

• Streamline the process of sourcing and organizing data from a wide variety of sources (using Python, PySpark, SQL, and Spark) and accelerate its preparation for analysis.

• Support the data curation process by feeding the data catalog and knowledge bases.

• Create data tools for analytics and data science team members that help them build and optimize data products for consumption.
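A minimal sketch of the "organizing data from a wide variety of sources" duty: mapping source-specific records onto one common schema before they reach the catalog or downstream consumers. The source names and field names here are assumptions for illustration only.

```python
def normalize(record, source):
    """Map a source-specific record onto a common {id, name} schema.

    Hypothetical sources "crm" and "billing"; real pipelines would drive
    this from configuration or schema metadata rather than if/elif.
    """
    if source == "crm":
        return {
            "id": record["customer_id"],
            "name": record["full_name"].strip().title(),
        }
    if source == "billing":
        return {
            "id": record["acct"],
            "name": record["name"].strip().title(),
        }
    raise ValueError(f"unknown source: {source}")

# Records arriving from two differently shaped feeds.
incoming = [
    ({"customer_id": 1, "full_name": "  ada lovelace "}, "crm"),
    ({"acct": 2, "name": "ALAN TURING"}, "billing"),
]
normalized = [normalize(record, source) for record, source in incoming]
```

Once records share a schema, feeding the data catalog and building analytics tooling on top becomes a single, uniform step rather than one per source.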