Primary Skills
- DataStream, ETL Fundamentals, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail-safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PL/SQL, T-SQL, Stored Procedures
Job Requirements
- Design, develop, and maintain robust ELT/ETL data pipelines to load structured and semi-structured data into Snowflake. Implement data ingestion workflows using tools such as Azure Data Factory, Informatica, dbt, or custom Python/SQL scripts.
- Write and optimize complex SQL queries, stored procedures, views, and UDFs within Snowflake.
- Use Snowpipe for continuous data ingestion, and manage tasks, streams, and file formats for near real-time processing.
- Optimize query performance using techniques such as clustering keys, result caching, materialized views, and pruning strategies.
- Monitor and tune warehouse sizing and usage to balance cost and performance.
- Design and implement data models (star, snowflake, normalized, or denormalized) suitable for analytical workloads.
- Create logical and physical data models for reporting and analytics use cases.
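The continuous-ingestion workflow described above (Snowpipe feeding a stream that a scheduled task drains) can be sketched in Snowflake SQL. All object names, the storage URL, and the JSON payload shape are hypothetical placeholders; an actual pipeline would also need cloud-side event notifications configured for auto-ingest.

```sql
-- All names below (stages, tables, warehouse) are hypothetical examples.

-- File format for incoming JSON files
CREATE OR REPLACE FILE FORMAT raw_json_fmt TYPE = 'JSON';

-- External stage pointing at a cloud storage location (placeholder URL)
CREATE OR REPLACE STAGE raw_events_stage
  URL = 'azure://exampleaccount.blob.core.windows.net/events'
  FILE_FORMAT = raw_json_fmt;

-- Snowpipe: copy new files into a raw table as they land in the stage
CREATE OR REPLACE PIPE raw_events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events (payload)
  FROM @raw_events_stage;

-- Stream tracks rows added to the raw table since the last read
CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

-- Task runs on a schedule, but only when the stream has new data
CREATE OR REPLACE TASK load_events_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_events_stream')
AS
  INSERT INTO events_curated
  SELECT payload:id::NUMBER, payload:event_ts::TIMESTAMP_NTZ
  FROM raw_events_stream;

-- Tasks are created suspended; resume to start the schedule
ALTER TASK load_events_task RESUME;
```

Reading the stream inside the task advances its offset, so each batch of new rows is processed exactly once.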
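A star schema of the kind the modeling bullets call for, with a clustering key on the fact table to support pruning, might look like this minimal sketch (table and column names are illustrative only):

```sql
-- Hypothetical star schema: one fact table joined to two dimensions.
CREATE TABLE dim_customer (
  customer_key  NUMBER PRIMARY KEY,
  customer_name STRING,
  region        STRING
);

CREATE TABLE dim_date (
  date_key       NUMBER PRIMARY KEY,
  calendar_date  DATE,
  fiscal_quarter STRING
);

CREATE TABLE fact_sales (
  sale_id      NUMBER,
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  date_key     NUMBER REFERENCES dim_date (date_key),
  amount       NUMBER(12, 2)
)
-- Clustering on date_key helps Snowflake prune micro-partitions
-- for the common "filter by date range" analytical query.
CLUSTER BY (date_key);
```

Analytical queries then filter the fact table by date and join out to the dimensions, which keeps scans narrow and joins simple.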