Intellectsoft is a software development company delivering innovative solutions since 2007. We operate across North America, Latin America, the Nordic region, the UK, and Europe. We specialize in industries including Fintech, Healthcare, EdTech, Construction, Hospitality, and more, partnering with startups, mid-sized businesses, and Fortune 500 companies to drive growth and scalability. Our clients include Jaguar Motors, Universal Pictures, Harley-Davidson, Qualcomm, and the London Stock Exchange. Together, our team delivers solutions that make a difference. Learn more at www.intellectsoft.net.

Requirements
- 7+ years of experience in data architecture, engineering, and database design
- Expertise in data lakes, data warehousing, ETL/ELT processes, and big data technologies
- Strong knowledge of Apache Kafka, NiFi, Spark, PostgreSQL, ClickHouse, Apache Iceberg, and Delta Lake
- Experience with data modeling, schema design, and performance tuning for analytical workloads
- In-depth understanding of data security, governance, access control, and compliance (e.g., GDPR, SOC 2)
- Familiarity with cloud services (AWS, GCP, Azure) for data storage, processing, and orchestration
- Hands-on experience with Kubernetes, Docker, and infrastructure-as-code (Terraform, Ansible) for deployment and automation
- Ability to optimize query performance and handle large-scale distributed data processing
- Knowledge of real-time data processing and streaming architectures
- Experience with Metabase, Grafana, or similar tools for data observability, monitoring, and analytics
- Strong problem-solving skills and ability to design highly scalable, fault-tolerant solutions
- Experience in collaborating with data scientists, engineers, and business analysts to ensure seamless data integration and usability
Responsibilities
- Design the architecture for the open-source-based data analytics platform
- Develop scalable data models, data pipelines, and data lakes
- Ensure integration of various data sources using tools such as Kafka, NiFi, Apache Airflow, and Spark
- Implement modern data platform components like Apache Iceberg, Delta Lake, ClickHouse, and PostgreSQL
- Define and enforce data governance, security, and compliance best practices
- Optimize data storage, access, and retrieval for performance and scalability
- Collaborate with data scientists, engineers, and business analysts to ensure platform usability
Benefits
- 35 absence days per year for work-life balance
- Udemy courses of your choice
- English courses with a native speaker
- Regular soft-skills training sessions
- Excellence Centers meetups
- Online and offline team-building events
All applications submitted through our system are delivered directly to the advertiser, and the privacy and security of applicants' personal data are ensured.