Agriculture | AI-assisted sonar signal processing
Client: AgFlo
Year of Completion: 2023
[Context and problem]
Grain and feedstock management is crucial for pig farmers, as it ensures the well-being and growth of the animals. On large farms, monitoring and forecasting grain levels in silos is a major logistical challenge. Poor stock management can lead to losses and put pressure on feed mills, the main grain suppliers.
Although technologies like automatic weighing systems or LIDAR sensors exist, they remain costly and complex to install. AgFlo therefore opted for a more accessible solution using sonar to measure grain levels. However, this technology is sensitive to variations in grain surface and noise interference, complicating measurements. Baseline was engaged to develop a signal processing algorithm capable of correcting and filtering these anomalies, ensuring reliable and accurate measurements.
[Approach]
Our team developed physical models to transform raw sonar data into accurate grain height estimates. These models were then optimized with machine learning techniques, increasing measurement precision and reducing errors. Advanced smoothing and filtering algorithms were also created to eliminate residual anomalies, providing a reliable and precise measurement of grain levels.
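At its core, a sonar measures the round-trip time of an ultrasonic pulse. A minimal sketch of the kind of physical model involved, with illustrative parameter names and a standard temperature correction for the speed of sound (this is not the deployed model):

```python
def grain_height(echo_delay_s, silo_height_m, temp_c=20.0):
    """Estimate grain height from a sonar echo delay (illustrative sketch).

    The speed of sound in air rises roughly 0.6 m/s per degree Celsius,
    one of the physical effects such a model must account for.
    """
    c = 331.3 + 0.606 * temp_c                    # speed of sound in air (m/s)
    distance_to_surface = c * echo_delay_s / 2.0  # halve the round trip
    return silo_height_m - distance_to_surface
```

In the real system, raw estimates of this kind were then refined with machine-learning models and smoothing filters.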
[Results]
With our solution, AgFlo transitioned from providing unreliable trends to delivering grain-level measurements with 99% accuracy compared to real-world observed values. This increased precision has allowed AgFlo to gain the trust of its clients and establish a strong foundation for developing new services and products related to grain stock management.
Mining Industry | Development of an AI-Assisted Semantic Data Schema Mapping System
Client: NeuroMines
Year of Completion: 2023
[Context and problem]
NeuroMines develops connectors that centralize data from all of its clients' heterogeneous systems (e.g., Epicore, SAP, CMMS) into a common data model (CDM) to maximize information value. However, each new client has its own unique database schema, composed of over 6,000 attributes (~800 tables) with varied and bilingual (French/English) naming conventions. Harmonizing this data is a colossal task requiring dual expertise (knowledge of the client's schema and the CDM). This process is therefore time-consuming, prone to human error, and creates inconsistencies due to a lack of standardization among developers. With growth targeting 50 to 100 clients, this integration step is becoming a critical bottleneck that is hindering production deployment.
[Approach]
To automate this cognitive task, we developed an algorithm that analyzes the semantics of the descriptions and column names to compare them with those in the CDM. The solution instantly recommends possible matches (Top-K) for each field to integrators to facilitate mapping.
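As a rough illustration of the Top-K recommendation idea: score every CDM field against a client column name and keep the best candidates. Here a character-level similarity from the standard library stands in for the multilingual semantic comparison used in production, and all field names are hypothetical:

```python
from difflib import SequenceMatcher

def top_k_matches(source_field, cdm_fields, k=3):
    """Rank CDM fields by similarity to a client column name (sketch)."""
    scored = [(SequenceMatcher(None, source_field.lower(), f.lower()).ratio(), f)
              for f in cdm_fields]
    return [f for _, f in sorted(scored, reverse=True)[:k]]

# e.g. a French client column matched against hypothetical CDM fields
suggestions = top_k_matches("no_equipement",
                            ["equipment_id", "site_id", "work_order_id"], k=1)
```

The integrator then confirms or rejects the suggestions instead of searching ~6,000 attributes by hand.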
[Results]
The AI-powered approach demonstrates impressive accuracy: the correct mapping is found in the suggestions in over 90% of cases, even for complex technical attributes. The system offers near-instantaneous response times, drastically accelerating the initial mapping process. This solution allows NeuroMines to standardize its integrations and absorb the workload associated with the massive acquisition of new customers.
Mining Industry | Computer Vision with Specialized OCR
Client: LithologIQ
Year of Completion: 2025
[Context and problem]
LithologIQ offers an advanced solution for the mineralogical analysis of drill cores using hyperspectral imaging. A critical challenge for the company lies in automating the extraction of depth data recorded on physical markers—small blocks of material inserted between or on top of the cores. Essential for depth-referencing the samples, these marker values are currently transcribed manually, a slow process prone to human error that delays data validation and decision-making.
The technological challenge is significant: these markers must be detected and interpreted in an environment without network connectivity, and the inference must be performed in less than 5 seconds on very high-resolution and sometimes very "dirty" images. By comparison, commercial cloud-based solutions (e.g., OpenAI) prove ineffective in this specific context, exhibiting a recognition rate of less than 10%.
[Approach]
To meet this challenge, we developed a custom computer vision pipeline optimized for local inference. Rather than processing the entire high-resolution image, which would be computationally expensive, our algorithm performs pre-detection to instantly target areas of interest containing markers. We then apply advanced image processing to neutralize visual interference such as water reflections, wood textures, or ink smear. Finally, an optical character recognition (OCR) module, coupled with post-correction logic, performs the transformation and improves the accuracy of the data extracted from the markers.
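The post-correction step can be illustrated with a small sketch. It assumes markers carry a numeric depth such as "120.5"; the confusion table and regular expression below are illustrative, not the deployed logic:

```python
import re

# common OCR digit confusions (illustrative subset)
OCR_FIXES = str.maketrans({"O": "0", "o": "0", "l": "1", "I": "1", "S": "5", "B": "8"})

def correct_depth(raw_text):
    """Post-correct an OCR reading of a depth marker (sketch).

    Returns the depth as a float, or None when no number survives cleanup.
    """
    cleaned = raw_text.strip().translate(OCR_FIXES)
    match = re.search(r"\d+(?:\.\d+)?", cleaned)
    return float(match.group()) if match else None
```

Rules of this kind, combined with consistency checks between consecutive markers, are what turn noisy character output into reliable depth values.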
[Results]
The developed solution achieved character detection and recognition accuracy of over 90%, far exceeding expected performance under these extreme conditions. Processing time was optimized to less than 3 seconds and requires no network connectivity, thus respecting processing constraints. This automation allows LithologIQ to eliminate the bottleneck of manual data entry and ensure data reliability.
Transport & Logistics | Operational and Tactical Optimization through AI (Fleet Optimization & Decision Intelligence)
Client: Geothentic in partnership with Rio Tinto (IOC)
Year of Completion: 2024
[Context and problem]
In the mining sector, managing and planning vehicle fleets presents significant productivity and operational profitability challenges. Mining companies face complex logistical issues when operating in remote, unmapped environments, far from urban optimization solutions. Departments often work in silos, limiting visibility and coordination, which complicates resource management and scheduling. Additionally, operational constraints, shifting priorities, and operating costs make the task even more challenging.
In this context, how can routes be optimized and fleet size adjusted to meet tactical needs while achieving operational goals?
[Approach]
Our approach covered both the operational and tactical aspects of fleet management, going well beyond the initial objective of fleet sizing and composition. By combining advanced algorithms with AI techniques, we leverage Geothentic's vehicle telemetry data (speed, orientation, location, etc.) to automatically reconstruct hundreds of kilometers of routes reflecting real-world terrain conditions.
Intuitively, the result resembles a map like Google Maps, displaying speeds and distances. From this reconstructed map, we designed a flexible, customized solution that integrates each client's specific parameters and constraints to generate optimized vehicle routes.
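Once the road network is represented as a graph, route optimization can build on classic shortest-path search. A minimal sketch with illustrative node names and travel times (the real solution layers fleet-level optimization on top of this):

```python
import heapq
from collections import defaultdict

def shortest_route(edges, start, goal):
    """Dijkstra over a road graph reconstructed from telemetry (sketch).

    `edges` maps (node_a, node_b) -> travel time, as could be derived from
    observed GPS speeds and distances between way-points.
    """
    graph = defaultdict(list)
    for (a, b), cost in edges.items():
        graph[a].append((cost, b))
        graph[b].append((cost, a))  # assume roads are two-way
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for cost, nxt in graph[node]:
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return None  # goal unreachable
```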
[Results]
This customized solution reduces travel distances by 15-25% compared to a human planner and fleet size by more than 10%. Designed to meet the specific challenges of off-road environments, the solution improves fleet profitability while ensuring planning is aligned with tactical and operational mining requirements.
Environment | Development of an intelligent system for correcting and improving the reliability of environmental sensors
Client: Revol’Air
Year of Completion: 2024
[Context and problem]
In Quebec City, citizen mobilization led to the deployment of low-cost sensors (Revol'Air) to monitor pollution. However, these sensors present a major technical uncertainty: they are highly biased by weather conditions. Humidity and temperature distort the accuracy of the measurements, creating a significant discrepancy with official reference stations. Furthermore, the sensors generate a high volume of missing data and outliers, making systematic analysis difficult for citizens. Finally, local anomalies that skew the data, such as the presence of smoke from a nearby BBQ, must be identified and corrected.
[Approach]
Our team integrated Environment Canada's meteorological data with Revol'Air sensor readings to create a system that continuously analyzes the data. In parallel, we designed an algorithmic and AI-powered correction pipeline to ensure the reliability of this data:
Smart cleaning: Automation of fault detection and removal of outliers via statistical filters.
Corrective modeling: Development of an additive AI model that isolates and corrects the influence of weather on sensor readings.
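The two steps above can be sketched as follows. The MAD filter is a standard robust outlier test; the correction coefficients are made-up placeholders, whereas the deployed model was fitted against reference-station data:

```python
import statistics

def clean_and_correct(pm_readings, humidity, a=0.1, b=-2.0):
    """Outlier removal followed by an additive humidity correction (sketch)."""
    med = statistics.median(pm_readings)
    mad = statistics.median([abs(r - med) for r in pm_readings]) or 1.0
    corrected = []
    for pm, rh in zip(pm_readings, humidity):
        if abs(pm - med) > 3 * 1.4826 * mad:  # robust z-score filter
            continue                          # e.g. a nearby BBQ smoke spike
        corrected.append(pm - (a * rh + b))   # subtract modelled humidity bias
    return corrected
```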
[Results]
The intervention transformed inaccurate sensors into reliable measuring instruments, now offering data quality comparable to that of nearby weather stations. This improved reliability translates into a drastic reduction in measurement error for pollution levels, from 10.33 μg/m³ to 3.53 μg/m³. This performance proves crucial during pollution peaks, significantly improving the reliability of alerts sent to the public in the event of a genuine health emergency.
Manufacturing Industry | Development of an Intelligent Planning/Scheduling System
Client: Centris Technologies for a paint manufacturing plant
Year of Completion: 2025
[Context and problem]
Centris Technologies, a systems automation specialist, is assisting a paint factory in optimizing its production planning and scheduling. The factory faces a complex scheduling challenge for its products (e.g., industrial lacquer, epoxy, acrylic) on shared equipment (e.g., tanks, reactors), compounded by a dynamic environment prone to frequent unforeseen events (e.g., urgent new orders).
The core challenge is to generate a production schedule that minimizes total cleaning time (a non-value-added operation) while simultaneously respecting multiple operational constraints (e.g., variable cleaning sequences, chemical incompatibilities, limited tank and reactor capacity, delivery deadlines, and the manufacturing of intermediate products).
[Approach]
We developed an advanced scheduling module that relies on a range of AI and combinatorial-optimization techniques. Instead of placing tasks sequentially, the module builds an overall view of the schedule and then intelligently explores valid, high-performing combinations that minimize cleaning times.
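The core idea, minimizing sequence-dependent changeover (cleaning) time, can be illustrated with a toy brute-force search. The production module uses far more scalable optimization techniques, and the product names and times below are illustrative:

```python
from itertools import permutations

def best_sequence(jobs, cleaning):
    """Pick the production order minimizing total cleaning time (sketch).

    `cleaning[(a, b)]` is the changeover time when product b follows
    product a; exhaustive search stands in for the real solver.
    """
    def total(order):
        return sum(cleaning[(a, b)] for a, b in zip(order, order[1:]))
    return min(permutations(jobs), key=total)
```

Even this toy version shows why order matters: scheduling chemically similar products back to back avoids the costliest cleanings.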
[Results]
The solution brings immediate added value to Centris' operations by radically increasing efficiency: it reduces planning time by 60% to 80% compared to traditional human methods. Beyond this productivity gain, AI becomes a true driver of sales agility. By instantly generating optimized schedules in response to unforeseen events, it allows sales teams to confirm order feasibility in real time and guarantee on-time delivery. To illustrate this power, we have developed an interactive demonstration based on this concrete case; feel free to contact us to try it.
Food | Development of an AI-based demand forecasting system
Client: Alfred Technologies
Year of Completion: 2025
[Context and problem]
For managers of event venues, such as sports centers (e.g., the Bell Centre) and outdoor events (e.g., Formula 1), anticipating inventory needs is a daily challenge. Determining the optimal quantity of each product to stock is crucial to avoid two costly pitfalls: stockouts (lost revenue) and overstocking (losses and waste). Demand varies significantly depending on the type of event (e.g., concert vs. hockey), the products, and promotions, making manual management prone to errors. Furthermore, the historical data available for analysis is often noisy and incomplete (e.g., untracked discounts), making modeling difficult.
[Approach]
Our intervention combined technical development and strategic support: we guided Alfred's team in structuring their analytical thinking and shared our expertise to overcome the complexity of the problem. Based on this collaboration and a rigorous methodology, our team developed an AI model designed to predict consumption by product and by event. The model was significantly enhanced by integrating key contextual variables. A major breakthrough was achieved by incorporating the impact of current promotions into the algorithms, a significant technical challenge given the lack of this information.
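A deliberately simplified sketch of conditioning predictions on event type, product, and promotions. The grouped average and the uplift factor are illustrative stand-ins for the trained model and its learned promotion effect; field names are hypothetical:

```python
def forecast(history, event_type, product, promo=False, promo_uplift=1.3):
    """Mean demand for (event_type, product), scaled when a promotion runs."""
    sales = [h["qty"] for h in history
             if h["event"] == event_type and h["product"] == product]
    base = sum(sales) / len(sales) if sales else 0.0
    return base * (promo_uplift if promo else 1.0)
```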
[Results]
In-depth data analysis and the integration of key factors have significantly improved the model's accuracy. The prediction error margin averages 8.5 units (e.g., 8 or 9 bags of chips at a full event). Such a small margin of error becomes negligible for the manager when dealing with massive sales volumes. For Alfred's clients, this translates into immediate cash flow benefits through inventory management. Future iterations aim to incorporate ticketing and weather data for outdoor events.
Health | MLOps Architecture and Production Deployment of AI Models in Ophthalmology
Client: LightX
Year of Completion: 2025
[Context and problem]
LightX, a company in the pre-commercialization phase of eye diagnostic solutions, needed to prepare the secure and reliable deployment of its AI models in a real-world clinical setting. The main challenge: controlling data drift between different clinics while implementing a robust and scalable MLOps architecture capable of supporting very large data volumes and expansion into the Canadian and American markets.
[Approach]
Baseline designed and implemented a scalable MLOps architecture on Azure ML Studio, integrating MLflow to standardize the model lifecycle. We developed the infrastructure and code necessary to deploy and manage more than 50 AI models, while also implementing an advanced monitoring system to track data quality, detect deviations, and prepare for automated model retraining.
[Results]
The models were successfully deployed in a secure and continuously monitored environment. The solution enables proactive detection of performance and data quality issues, while ensuring operational stability despite workload variability between clinics. Thanks to automated training and deployment pipelines, LightX now has a platform ready to support rapid growth and the gradual integration of new clinics.
Forestry Industry | Development of a Contextual Recommendation System for Sales
Client: PMP Solutions
Year of Completion: 2025
[Context and problem]
PMP Solutions develops software tools for wood processing plants, aiming to maximize their operational performance. A major challenge lies in the ability of salespeople to offer optimized sales options that are both realistic (for the plant) and attractive (for the customer), in order to increase company margins.
For salespeople, it is difficult to simultaneously consider the multitude of dynamic factors that influence the price and composition of an offer (e.g., order history, stock inventory, future production, transportation, price indices). The lack of effective tools and the reliance on individual salesperson knowledge lead to lost or undervalued sales.
[Approach]
Faced with volatile market prices and the need for flexibility, we have developed an intelligent system designed to manage uncertainty and the non-stationarity of prices and customer preferences. Instead of relying on extensive historical data to learn patterns, the solution focuses on a targeted context (e.g., customer preferences, current and future inventory, recent sales prices).
[Results]
Although the project is in the deployment phase, preliminary results confirm the power of this proactive approach to optimizing sales offers. The prototype is already able to recommend highly personalized sales options, including lot composition, price, and delivery times, while identifying immediate margin opportunities, such as substitution with higher-grade wood available in excess stock. To ensure optimal adoption, the solution provides sales teams with precise indicators to validate each recommendation. Currently being finalized, this tool aims to directly maximize the profitability and operational performance of factories.
Transportation Industry | Development of a RAG System
Client: SFPPN
Year of Completion: 2026
[Context and problem]
The Pointe-Noire Railway and Port Company (SFPPN), an industrial logistics hub in Sept-Îles, Quebec, provides rail and port transport of iron ore for the natural resources industry. Critical maintenance data (intervention histories, lockout/tagout procedures, work orders, and engineering plans) are scattered and poorly standardized, complicating information sharing and slowing down operations.
[Approach]
Baseline is developing an intelligent assistant that pairs a Large Language Model (LLM) with a Retrieval-Augmented Generation (RAG) system to facilitate information retrieval within existing data. SFPPN data will be ingested into a vector database, enabling fast, contextualized responses. The assistant will be integrated into their Maximo environment, giving maintenance personnel seamless information retrieval and simplified access to documents and procedures.
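The retrieval step of RAG can be sketched with a toy bag-of-words similarity; a production system uses a real embedding model and a vector database, and the documents below are illustrative:

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real system uses a learned model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k documents closest to the query -- the RAG retrieval step.

    The retrieved text is then passed to the LLM as context for its answer.
    """
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
```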
[Results]
The solution aims to make SFPPN maintenance more efficient, standardized, and accessible by centralizing information and enabling teams to quickly retrieve critical procedures and data. The collaboration with Baseline will accelerate the transition to smarter, more proactive maintenance management, improving the safety and productivity of rail and port operations.
Engineering | Development of a system for managing and leveraging geospatial data
Client: Tetra Tech
Year of Completion: 2023
[Context and problem]
Tetra Tech collects thousands of data points annually for its civil engineering projects, including field surveys, LIDAR sensor readings, and photographs. All of this data is georeferenced. The company struggles to maintain a comprehensive understanding of the geospatial data collected during its projects. For instance, the same data may be collected more than once for different projects at the same geographical location, incurring significant additional costs for the company.
[Approach]
Our team developed a solution that leverages geospatial information from the open-source SDI solution GeoNetwork, which allows for the creation of a geospatial data catalogue and facilitates searching within this catalogue through a user interface.
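One building block of such a catalogue search is a spatial overlap test. A minimal sketch using axis-aligned bounding boxes in (min_lon, min_lat, max_lon, max_lat) form; GeoNetwork itself handles much richer geometries and metadata:

```python
def boxes_overlap(a, b):
    """True when two survey footprints intersect (sketch).

    Overlap with an existing catalogue entry hints that field data for
    the area may already exist and can be reused instead of re-collected.
    """
    # disjoint when one box lies entirely beside or above/below the other
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])
```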
[Results]
Tetra Tech now has a technological solution that enables its engineering and geomatics teams to better reuse and access their data, thereby reducing the costs associated with field data collection.
Why choose Baseline
Cutting-edge expertise in AI
A pragmatic and methodical approach
End-to-end support
Concrete and measurable results
Our AI development method
Developing a high-performing and sustainable AI solution is not something that can be done on the fly. At Baseline, we follow a structured and agile approach that puts you at the heart of the process.
Innovation canvas
Co-construction of the business project to align the technological vision of AI with the strategic objectives of the company.
Statement of Work
Concrete definition of requirements (functional and non-functional), constraints (technical, regulatory) and validation of the sufficiency and accessibility of the data.
Development plan
Breaking down the overall vision into development stages to balance risk management and continuous value delivery.
Prototyping
Rapid validation of the AI-based technical approach, adjustment of the trajectory and construction of a usable prototype serving as the foundation for the final system.
Application development
Building the remaining functionalities around the prototype, using proven iterative methods, and integrating with existing systems (ERP, CRM, etc.).
Deployment
Making functionalities available at scale, solidifying the infrastructure and setting up monitoring and observability tools to detect failures.
Training and knowledge transfer
Training of users and internal teams for the use and maintenance of the solution, ensuring autonomy through clear documentation.
Ongoing maintenance
Post-deployment support to ensure stability, security, adaptability, and to provide updates and integration of new features as needed.