At PointClickCare, the mission is to help providers deliver exceptional care, starting with their people.
The company is a leading health tech firm that is founder-led and privately held, empowering employees to innovate and shape the future of healthcare.
PointClickCare has the largest long-term and post-acute care dataset and a Marketplace of over 400 integrated partners, serving more than 30,000 provider organizations.
The company reinvests a significant share of its revenue into research and development to fund ongoing innovation.
PointClickCare has been recognized by Forbes as a top private cloud company and honored as one of Canada’s Most Admired Corporate Cultures, offering flexibility, growth opportunities, and meaningful work.
Employees are encouraged to harness AI as a catalyst for creativity, productivity, and thoughtful decision-making.
Daily tasks:
Identify, prioritize, and execute tasks in the software development life cycle.
Work with the business to iterate on software requirements.
Develop tools and applications, and automate tasks.
Analyze and debug systems.
Perform validation and verification testing.
Review work and collaborate with internal teams and vendors.
Ensure software is kept up to date.
Work with distributed computing systems such as Apache Hudi and Trino for big data processing.
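The validation and verification testing above is typically done test-first; a minimal sketch using Python's `unittest`, where `normalize_patient_id` is a hypothetical helper invented for illustration (not an actual PointClickCare function):

```python
import unittest


def normalize_patient_id(raw: str) -> str:
    """Hypothetical helper: trim whitespace and upper-case a patient ID."""
    return raw.strip().upper()


class TestNormalizePatientId(unittest.TestCase):
    """In a TDD loop these tests are written first, then the helper
    is implemented (and refactored) until they pass."""

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(normalize_patient_id("  ab123 "), "AB123")

    def test_uppercases_letters(self):
        self.assertEqual(normalize_patient_id("ab123"), "AB123")
```

Run with `python -m unittest <file>` to execute the test cases.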
Requirements:
Candidates must have experience with distributed computing tools like Apache Hudi, Trino, MapReduce, and other big data technologies.
Experience with distributed storage systems such as HDFS and S3 is required.
Familiarity with Hadoop, Spark, or other distributed computing systems is necessary.
An understanding of data partitioning and sharding techniques is essential.
Knowledge of distributed computing principles and their application to large-scale data processing is required.
Candidates should have experience writing clean code that performs well at scale using languages such as Java, Kotlin, C#, or Go.
Proficiency in scripting languages such as Python is necessary.
Knowledge of relational databases, including Microsoft SQL Server and MySQL, is required.
Solid experience in writing RESTful API endpoints is essential.
A strong affinity for, and working knowledge of, Test-Driven Development (TDD) is required.
Proficiency in Git is necessary.
Experience using system and performance monitoring tools like New Relic or Datadog is required.
Proven experience in Data Engineering with expertise in big data, advanced AI integration, and SaaS applications is essential.
Bonus points for experience with Agentic AI components.
Candidates should be self-starters with excellent organizational, critical-thinking, and personal leadership skills.
An analytical mind with problem-solving aptitude is necessary.
A BSc/BA in Computer Science or a related degree is required.
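The partitioning and sharding requirement above can be sketched with simple hash-based routing; a minimal Python example, where the shard count and record keys are illustrative assumptions rather than any actual production scheme:

```python
import hashlib


def shard_for(key: str, num_shards: int = 8) -> int:
    """Route a record key to a shard using a stable hash.

    MD5 is used for its stable, platform-independent digest, not for
    security; Python's built-in hash() is salted per process and would
    route the same key to different shards on different runs.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards


# Illustrative keys: any stable record identifier would work.
records = ["facility-001", "facility-002", "facility-003"]
placement = {key: shard_for(key) for key in records}
print(placement)
```

The same deterministic mapping lets readers and writers on different machines agree on where a record lives without any coordination.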
Benefits:
Benefits start from Day 1, including a Retirement Plan with Matching.
Flexible Paid Time Off is offered.
Wellness Support Programs and Resources are available.
Parental and Caregiver Leaves are provided.
Fertility and Adoption Support is included.
Continuous Development Support Program is available for employees.
An Employee Assistance Program is offered.
There are Allyship and Inclusion Communities within the company.
Employee Recognition programs are in place, along with additional benefits.