Certain pieces of information need to be in place before beginning to generate a dataset:
1. What timeframe will the dataset cover?
The dataset will start in line with Globocom's expansion plans: the initial upload is dated 30th September 2018, allowing a side-by-side validation period of one quarter before moving to full production on 1st January 2019. The dataset will then contain a further six years of data, ending 31st December 2024.
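As a rough illustration, these dates could be captured as generation parameters; this is a minimal Python sketch and the names are my own, not part of any existing tool:

```python
from datetime import date

# Key dates for the dataset timeframe (names are illustrative only).
INITIAL_UPLOAD = date(2018, 9, 30)    # first snapshot of Globocom's workforce
PRODUCTION_START = date(2019, 1, 1)   # full production after one quarter of validation
DATASET_END = date(2024, 12, 31)      # final day of generated data

# Total span the generator must cover, inclusive of both endpoints.
total_days = (DATASET_END - INITIAL_UPLOAD).days + 1
print(f"Generating {total_days} days of data")
```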
2. What data is required?
A basic reporting suite requires data about the company, its workers, their compensation and their organisational relationships. This scope alone limits any tool's capability: basic reporting and rudimentary forecasting are possible, but it delivers very little in the way of insight.
Each additional data element increases the capacity for analysis, so along with the basic required data I will be including event data (covering hires, changes and terminations) and recruitment data (job requisitions, candidate and application information).
In future I aim to introduce additional data elements that can help generate insight, such as worker performance and engagement ratings.
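To make that scope concrete, the sketch below groups the planned data elements into core, event, recruitment and future categories; the entity and field names are my own illustrations rather than a fixed schema:

```python
# Illustrative grouping of the planned data elements; field names are assumptions.
DATA_SCOPE = {
    "core": {
        "company":      ["company_id", "name", "location"],
        "worker":       ["worker_id", "name", "hire_date", "job_title"],
        "compensation": ["worker_id", "salary", "currency", "effective_date"],
        "organisation": ["worker_id", "manager_id", "department", "cost_centre"],
    },
    "events": {
        "hire":        ["worker_id", "event_date", "details"],
        "change":      ["worker_id", "event_date", "details"],
        "termination": ["worker_id", "event_date", "reason"],
    },
    "recruitment": {
        "job_requisition": ["requisition_id", "opened_date", "department"],
        "candidate":       ["candidate_id", "name", "source"],
        "application":     ["application_id", "candidate_id", "requisition_id", "status"],
    },
    # Planned for a later phase.
    "future": {
        "performance": ["worker_id", "review_date", "rating"],
        "engagement":  ["worker_id", "survey_date", "score"],
    },
}
```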
3. What granularity of data is needed?
Most reports will require a combination of current-state data, date-ranged records and successfully completed event records. There will be opportunities at a later date to add transactional records for several data elements, enabling process-efficiency reporting.
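As an example of what a date-ranged record might look like, here is a minimal sketch assuming a simple effective_from / effective_to convention; the structure and field names are assumptions for illustration:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CompensationRecord:
    """A date-ranged compensation row; effective_to of None marks the current state."""
    worker_id: str
    salary: int
    effective_from: date
    effective_to: Optional[date]

history = [
    CompensationRecord("W0042", 52000, date(2018, 9, 30), date(2019, 12, 31)),
    CompensationRecord("W0042", 56000, date(2020, 1, 1), None),  # current-state record
]

def as_of(records, on: date):
    """Return the record in force on a given date, if any."""
    for r in records:
        if r.effective_from <= on and (r.effective_to is None or on <= r.effective_to):
            return r
    return None

print(as_of(history, date(2019, 6, 1)).salary)  # 52000
```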
4. What is the scope of growth?
As established, we will start with Globocom's 184 employees as of 30th September 2018. I am aiming to emulate a few common business events over the dataset's timeframe, including:
Expansion to an additional region (UK, EMEA).
Organisational restructuring.
Expansion to an additional region (Singapore, APAC).
Business restructuring.
Expansion of sales offices across all regions.
I have not arrived at an end headcount with any certainty; however, I am aiming for around 5,000.
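One way to feed these milestones into a generator is a simple schedule of target headcounts. In the sketch below, only the 184 starting headcount and the rough 5,000 end target come from the plan above; the dates and intermediate entries are placeholder assumptions:

```python
from datetime import date

# Growth milestones for the generator. Only the 184 start and ~5,000 end
# figures are from the plan; the dates and None targets are placeholders.
GROWTH_PLAN = [
    (date(2018, 9, 30),  184,  "Initial Globocom workforce"),
    (date(2020, 1, 1),   None, "Expansion to an additional region (UK, EMEA)"),
    (date(2020, 9, 1),   None, "Organisational restructuring"),
    (date(2021, 6, 1),   None, "Expansion to an additional region (Singapore, APAC)"),
    (date(2022, 4, 1),   None, "Business restructuring"),
    (date(2023, 1, 1),   None, "Expansion of sales offices across all regions"),
    (date(2024, 12, 31), 5000, "Approximate end-of-dataset headcount"),
]
```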