Sometime around 2002, we started developing a web-based e-learning system for ourselves. The system was intended to support successful preparation for audits: we planned to use it to provide training on our standardisation procedures and to document our training processes.
The system was created and implemented according to plan. We thought that was the end of the story, but in fact, it was just the beginning.
At that time, we were working with a small but dynamically growing Hungarian company, which rapidly expanded from 70 employees to 140 and then to 350.
At some point, our internal training platform came up in our discussions with them, and it was soon deployed there as well. In the following years, the company shaped the system in its own image, and after two or three years it became indispensable to them.
In the mid-2010s, the Hungarian company was acquired by a strategic investor from the USA. Our system survived in the Hungarian business, but the American company already had an e-learning supplier, so there was no question of progressing further with it. However, the parent company gradually became aware of our “boutique” solution, and started to show increasing interest in it.
Another three years later, they started to validate our system, which took six months and resulted in a 98-page validation report – with only a single observation related to server configuration.
The system was then deployed throughout the organisation.
Every day, before each employee starts work in the morning, a validation is run to confirm that the employee has valid and appropriate training or certification for all the activities they are to perform that day. Accordingly, it is no surprise that the expected availability is 99.99%.
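Just to illustrate the core of this check (the real system's data model and rules are considerably more involved, and all names below are made up), the daily validation boils down to something like this:

```python
from datetime import date

# Illustrative sketch only: every activity planned for an employee today must be
# covered by a training/certification record that has not yet expired.
def cleared_for_today(planned_activities, certifications, today=None):
    """certifications maps an activity name to the expiry date of the relevant training."""
    today = today or date.today()
    missing = [activity for activity in planned_activities
               if activity not in certifications or certifications[activity] < today]
    return not missing, missing

ok, missing = cleared_for_today(
    ["soldering", "forklift driving"],
    {"soldering": date(2026, 3, 1), "forklift driving": date(2024, 12, 31)},
)
# ok is False and missing == ["forklift driving"] once the forklift certificate has expired
```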
The system is linked to our name not only as its designers and developers, but also as its operators. So we can legitimately consider ourselves a company with the capacity to provide services worldwide.
The investment to cut costs
There is vast potential for savings in optimising production processes. Planning has to be conducted in any event (having regard, for example, to available warehouse inventory), so why not optimise that planning for all the resources used?
For companies with complex production lines, this problem is at least partially solved: suppliers often already provide optimisation solutions for their systems.
However, for smaller businesses and simpler workflows – in which, for example, there are no sensors to allow automated data collection – large-scale industrial automation is not an option, and hand-crafted solutions are needed.
Our solution, based on general operations research methods, helps in this process. The simplest way to illustrate this is through the following example.
Imagine a workshop with ten to fifteen fitters that makes advertising billboards: metal frames are produced; plastic surfaces are created, painted, decorated and bonded; and electrical work is done. They perform a relatively small number of work processes, between fifty and one hundred, using simple tools.
In day-to-day production, the primary objective is to meet delivery deadlines, with the secondary objective being to operate as cost-effectively as possible. With the current, human-based process organisation, the primary goal, that is, meeting deadlines, is achieved at a rate of 95–98%, so the company's perception is good, and it can turn a profit. However, only rough estimates are available for cost-effectiveness indicators, mostly at a monthly level. Project-level follow-up calculations are likewise missing, so it is impossible to identify any potentially underpriced processes.
Planimeter can create a web-based application that automatically schedules tasks, with the optimised scheduling re-run before each new work process begins. The system can also detect changes during the day, such as a resource change due to a machine failure, and reschedule in real time.
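The production models we build are tailored to each workshop, but the deadline-driven core of such scheduling can be sketched with a simple earliest-deadline-first heuristic (the jobs, durations and fitter count below are purely illustrative):

```python
import heapq

# Minimal earliest-deadline-first sketch: jobs are (deadline_day, duration_hours, name),
# fitters are represented only by the number of hours already assigned to them.
def schedule(jobs, n_fitters, hours_per_day=8):
    fitters = [0.0] * n_fitters
    heapq.heapify(fitters)
    plan, late = [], []
    for deadline_day, duration, name in sorted(jobs):   # earliest deadline first
        start = heapq.heappop(fitters)                  # least-loaded fitter
        finish = start + duration
        heapq.heappush(fitters, finish)
        plan.append((name, start, finish))
        if finish > deadline_day * hours_per_day:       # deadline expressed in working days
            late.append(name)
    return plan, late

# Re-running schedule() with updated jobs and capacity is the "reschedule" step,
# e.g. after a machine failure removes a resource for the rest of the day.
plan, late = schedule([(1, 5, "frame A"), (1, 3, "paint A"), (2, 6, "wiring B")], n_fitters=2)
```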
Deploying such a system therefore not only fulfils the primary goal, but also meets the expectations related to the secondary one and to follow-up calculations.
As with all IT development, this also costs money. On the other hand, if, for instance, the efficiency of a workforce costing 10 million HUF per month in wages increases by only 10%, that is worth roughly 1 million HUF a month, so an 8–10 million HUF development is recovered very quickly, in eight to ten months, and starts generating a profit from the eleventh month at the latest. Accordingly, a sufficiently innovative manager will see such a capital expenditure as an investment rather than spending.
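Putting rough numbers on this (the figures are the ones used above, treated as monthly values), the back-of-the-envelope payback calculation is simply:

```python
monthly_wage_cost = 10_000_000   # HUF per month, the capacity figure from the example above
efficiency_gain = 0.10           # a 10% improvement
development_cost = 9_000_000     # HUF, midpoint of the 8-10 million range

monthly_saving = monthly_wage_cost * efficiency_gain   # 1,000,000 HUF per month
payback_months = development_cost / monthly_saving     # 9 months; profit from then on
```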
We have successfully applied the above approach at manufacturing companies and for call automation in call centres.
What will our current account balance be on the 10th of November?
For stores or store networks conducting a large number of daily atomic financial transactions (purchases), forecasting the cashflow situation is difficult.
Even the volume of purchases in itself shows a high degree of variability (weekly and monthly fluctuations), which is compounded by the seasonal effect of the weather and the shopping sprees associated with long weekends and holidays. It is precisely under these influences that the composition of card, voucher and cash payments needs to be estimated accurately and, preferably, forecast.
The conditions described above pose a modelling problem regardless of the inflation rate. Forecasting the nominal size of revenue is further complicated by double-digit inflation, but in fact this complication appears as soon as inflation creeps above 5%.
However, cashflow forecasting is a challenge that can be addressed and solved using statistical methods.
Pattern analysis and learning algorithms developed on the available historical transaction data can be used to classify purchases and thus derive statistical indicators for each purchase category.
The time-series based statistical approach to purchasing data allows the filtering of seasonal and trend effects, and the correction of atypical sales data. Taken together, the modelling yields an estimate of the revenue cashflow for any future date. An important consequence of statistical estimation theory is that the error of each estimate can also be quantified. A point estimate (for example, 10 million HUF) in effect has a low probability of being exactly right, whereas an interval estimate (for example, 9.5–10.5 million HUF) is a truly informative forecast: if our modelling works well, we can be 95% sure that our cash revenue will fall within the specified range.
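As a minimal sketch of the idea (not our production model), daily revenue can be decomposed into a trend and a weekly profile, and the residual spread turned into a 95% interval around each point forecast; the data below are simulated and all names are illustrative:

```python
import numpy as np
import pandas as pd

# Simulated daily revenue series; in practice this comes from the transaction history.
rng = np.random.default_rng(0)
days = pd.date_range("2023-01-01", periods=365, freq="D")
revenue = (10_000_000 + 20_000 * np.arange(365)
           + 1_000_000 * (days.dayofweek >= 5) + rng.normal(0, 400_000, 365))
s = pd.Series(revenue, index=days)

trend = np.polyfit(np.arange(len(s)), s.values, 1)                        # linear trend
detrended = s.values - np.polyval(trend, np.arange(len(s)))
weekly = pd.Series(detrended, index=days).groupby(days.dayofweek).mean()  # weekly profile
resid = detrended - weekly[days.dayofweek].values
sigma = resid.std(ddof=1)

def forecast(day):
    """Point forecast and 95% interval for a future date (normal approximation)."""
    t = (day - days[0]).days
    point = np.polyval(trend, t) + weekly[day.dayofweek]
    return point, (point - 1.96 * sigma, point + 1.96 * sigma)

point, interval = forecast(pd.Timestamp("2024-11-10"))   # e.g. the 10th of November
```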
Methodologically, weather forecasting uses different formulas, but the theoretical basis is the same. Just as, over the decades, people have learned to rely on ever longer-term weather forecasts, the time is now ripe for business actors to draw on a growing number of reliable forecasts of their own. They can then also see what their current account balance will be on a particular future date, for example the 10th of November.
We have carried out the modelling described above for financial institutions, and are currently doing so for a recruitment agency.
The report is wrong again!
Operational systems can obviously provide up-to-date (or real-time) information about our processes. From time to time, however, snapshots are needed, for example to meet the data needs of accounting. Each such snapshot can be called a report. Even where data links in larger systems automatically satisfy standard reporting needs, tasks continuously arise, for instance due to government measures, that require regular reporting but whose IT support has not been solved.
This is the point where automated, non-AI-based report generation can provide support.
The essence of automated document (report) generation is that programming tools are used to create a program that can read input data (or databases), perform calculations, and generate tables or charts. The final result can be produced in any of the Word, PDF, HTML or RTF formats.
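A minimal sketch of such a generator in Python, assuming a hypothetical monthly_sales.csv as input (our real projects use richer toolchains, templates and charting), could look like this:

```python
import pandas as pd

# Read the input data, derive the results, and render them into an HTML report.
data = pd.read_csv("monthly_sales.csv")                        # hypothetical input file
summary = data.groupby("region", as_index=False)["revenue"].sum()

html = f"""<html><body>
<h1>Monthly sales report</h1>
{summary.to_html(index=False)}
</body></html>"""

with open("report.html", "w", encoding="utf-8") as f:
    f.write(html)
```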
In the report, the derived results (such as a table) are generated at runtime, meaning that after any change in the input data, the report can be regenerated in one or two minutes.
In addition to its speed, this procedure stands out for its quality: since there is no manual data entry, typos are entirely out of the question (notwithstanding, of course, the possibility that the error was already present in the input data).
Conditional statements can also be formulated during processing. If, for instance, you want to use the verb form “increased” for a positive change and “decreased” for a negative change, then this choice can be integrated into the program, and the verb form adapted to the scenario will be entered in the report without any human intervention whatsoever. Similar rules can be defined for formatting needs: for example, a negative value can be shown in red (even in a table or a chart) and a positive value in green.
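The conditional wording and colouring described above reduce to simple rules inside the generator; a small illustrative sketch:

```python
def describe_change(change_pct):
    """Choose the verb and the colour depending on the sign of the change."""
    verb = "increased" if change_pct >= 0 else "decreased"
    colour = "green" if change_pct >= 0 else "red"
    sentence = f"Revenue {verb} by {abs(change_pct):.1f}% compared to the previous month."
    cell = f'<td style="color:{colour}">{change_pct:+.1f}%</td>'
    return sentence, cell

# Produces "Revenue decreased by 3.2% ..." and a red table cell, with no manual editing.
sentence, cell = describe_change(-3.2)
```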
Reproducibility is another advantage. If a report is later found to be incorrect, the input data can be reconstructed and used to determine whether the error came from the source data or whether the generator program needs to be modified. In the latter case, the bug can usually be fixed with one or two hours of work, and all the reports thought to be incorrect can be regenerated in a matter of seconds.
Generated reports, especially with HTML output, open up new possibilities in dashboard development, since a management information system is ultimately a reporting interface which, following the reasoning above, can also be built within the scope of a data science project that is less programming-intensive and thereby more cost-effective.
This approach has been applied extremely successfully in several of our international projects for years. In all cases, our solutions are based on an external or even internal database that is updated regularly (usually monthly). With an unchanged reporting framework, it is easy to add new elements according to the prevailing market needs.