Our Digital Planet: Rise of the Digital Worker – The New Breed of Worker


British-Australian mining giant Rio Tinto has recently deployed autonomous trucks, excavators and drills to create the first workerless iron ore mine in Western Australia. The drivers – if they can still be called that – work out of a remote operations centre hundreds of kilometres away, where data scientists mine the data collected from the vehicles’ sensors. This dynamic, known as ‘human and digital recombination’, is but a single step on the path to a changed workplace, as connectivity and automation drive the transition to digital on an unprecedented scale.

Real-time analysis, together with emerging digital technologies and intelligent digital processes, has upended the workplace as we know it; businesses today are undergoing a deep cultural shift in work organisation, culture and management mindset. The result is a shift towards workers acting on the information available to them, rather than resorting to ‘exploratory surgery’ once the damage is already done.

Human and digital recombination, cutting-edge decision making, real-time adaptation and experiment-driven design are pushing this transformation, not just in manufacturing but in every conceivable area of the workplace. And while the technology has done much to facilitate the transition to digital, the challenges are many.

Fat tags

Aside from Rio Tinto’s automated vehicles, other software-enabled, manufacturing-friendly marvels are around the corner, such as kilobyte-rich radio frequency identification (RFID) tags. Basically position finders at present, tomorrow’s tags will have so much storage capacity that they will act like transponders and actually tell people what to do.

As Siemens’ Markus Weinlander, Head of Product Management, predicted: “[RFID tags] can make a major contribution to the realisation of Industry 4.0 by acting as the eyes and ears of IT. For the first time, transponders will be able to carry additional information such as the production requirements together with their assembly plan. All of this will be readable at relatively large distances.”

These ‘fat tags’ will do more than boost automation. They will also make companies more nimble-footed and, say experts, allow small businesses to compete with the giants. According to Weinlander, the new wave of RFID tags will greatly facilitate customised products because they will contain all the essential information for small runs. “To remain competitive in today’s global market environment, many companies have to be able to produce in tiny batches without higher costs”, he said.
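To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of payload such a high-capacity tag might carry. The field names, values and JSON serialisation are assumptions for illustration, not a Siemens specification.

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class TagPayload:
    """Illustrative payload a kilobyte-class RFID tag might carry."""
    part_id: str
    batch_size: int                      # tiny, customised production runs
    production_requirements: List[str]
    assembly_plan: List[str]             # ordered assembly steps

    def to_bytes(self) -> bytes:
        """Serialise the payload so it can be written to the tag's user memory."""
        return json.dumps(asdict(self)).encode("utf-8")

payload = TagPayload(
    part_id="GEAR-7421",
    batch_size=12,
    production_requirements=["hardened steel", "tolerance 0.01 mm"],
    assembly_plan=["mill housing", "press-fit bearing", "laser-mark serial number"],
)
print(len(payload.to_bytes()), "bytes to write to the tag")
```

A payload of this size comfortably fits the kilobyte-scale user memory the article describes, which is what lets the product itself carry its production requirements through the line.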

Other practical benefits are likely. For instance, maintenance and repair work will be made simpler, faster and more timely. As BCG Consulting points out, technicians will identify any problems with a machine from a stream of real-time data and then make repairs with the help of augmented-reality technology supplemented, if necessary, by remote guidance from off-site experts. In this way, downtime per machine will be reduced from one day to an hour or two.


Digital people

In this brave new world of hyperconnectivity, the ‘digital worker’ – a data-driven individual skilled in converting information into revenue – will stand in the middle and direct traffic, as it were. As SAP put it in its D!gitalist magazine, the digital worker will “create instant value from the vast array of real-time data.”

Instead of the traditional approach of gathering, processing, and moving data around while spending valuable time creating reports, digital workers will be forced to move towards predictive, scenario-based, and prognosis-based decision-making. SAP’s article goes on to explain: “The speed of information and data is driving such significant change in how and where we work that the digital worker is becoming a critical resource in decision-making, learning, productivity, and overall management of companies.”

Hyperconnectivity has led us to a new era, where Peter Drucker’s “knowledge worker” has come to an end and the “digital worker” now needs to step up and create instant value from the vast array of real-time data.

In organisations where data-savvy individuals may know more about what’s happening than the boss, the top-down hierarchy will be overturned. In short, everybody will be a leader in their own particular area of expertise. “The traditional management and organisational model is quickly getting outdated in the digital economy, and true leaders are changing their management approach to reflect this”, said SAP. Senior executives will have to be more visible and approachable for employees and customers alike – at once colleague and captain.

“[Managers] must juggle a distributed contingent workforce with digital workers who require real-time analysis, prognosis, and decision making. At the same time, they must develop the next generation of leaders who will actively take responsibility for innovation and engagement”, said SAP.

If done properly, this new collaborative workplace could reduce the complexity that bedevils most large organisations in an era of globalisation. According to the Economist Intelligence Unit, 55 percent of executives believe their organisational structure is ‘extremely’ or ‘very’ complex and 22 percent say they spend more than a quarter of their day managing complexity. More than three-quarters say they could boost productivity by at least 11 percent if they could cut complexity by half.

More jobs

But will the superconnected workplace destroy jobs? BCG Consulting thinks not. In a study of German manufacturing released in October, the consultancy concluded that higher productivity actually means higher employment at home. “As production becomes more capital intensive, the labour cost advantages of traditional low-cost locations will shrink, making it attractive for manufacturers to bring previously off-shored jobs back home”, the study predicted. “The adoption of Industry 4.0 will also allow manufacturers to create new jobs to meet the higher demand resulting from the growth of existing markets and the introduction of new products and services.”

Experts such as Ingo Ruhmann, Special Adviser on IT systems at Germany’s Federal Ministry of Education and Research, agree with this finding. “Complete automation is not realistic”, he told BCG Perspectives. “Technology will mainly increase productivity through physical and digital assistance systems, not the replacement of human labour.”

However, it will be a new kind of human labour. “The number of physically demanding or routine jobs will decrease while the number of jobs requiring flexible responses, problem solving, and customisation will increase”, Ruhmann predicts. For most employees, tomorrow’s workplace should be a lot more fun.


How Big Data is changing e-commerce for good


You’re going to have to get used to it: data is everywhere, we contribute to it constantly, and it has a huge impact on the retail industry. In fact, 80-90 percent of all the data in the world was created in just the last two years. Big data is generally portrayed as holding the solutions to many of the biggest issues in retail (as long as it is mined and utilized correctly), but it still has a lot of quirks to work out.

A big chunk of the problem with big data comes down to how it’s defined. There are dozens of different takes on it, but let’s define it as the rapid growth of diverse, fast-changing data created through multiple channels, which must be processed in innovative and strategic ways.

Global retail sales should reach $3 trillion this year, and big data has the potential to unlock a larger slice of the market share pie for retailers. How? Well, that’s the interesting part.

How Big Data Helps

Since it is defined so broadly, big data encompasses many types of information that are useful to retailers, both online and in-store. They say that you don’t know until you mine, and that couldn’t be more true. We’ve all heard the beer-and-diapers example, in which Walmart learned that the two products were often bought together after analyzing data from in-store purchases. Retailers have many insights to gain from their customers’ purchasing habits, such as when shoppers buy the most, what they buy together, and which offers are most effective.
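To make the beer-and-diapers idea concrete, here is a minimal sketch of how a retailer might count which items are bought together. The basket data is made up for illustration; a real system would read point-of-sale records instead.

```python
from collections import Counter
from itertools import combinations

# Made-up baskets standing in for real point-of-sale records.
baskets = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"bread", "milk", "beer"},
    {"diapers", "beer", "milk"},
]

pair_counts = Counter()
for basket in baskets:
    # Count every pair of items that appears together in one basket.
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs are candidates for joint promotions or bundles.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

At Walmart scale the counting happens over millions of transactions, but the principle is the same: surface the co-purchases, then decide what to do with them.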

Retailers understand that they need data to win, as 59 percent identified a lack of consumer insights as their top data-related pain point. Big box retailers are already benefiting from big data. Walmart creates 1 million rows of transaction records each hour from a combination of in-store purchases, social data, and more. This gives the retailer access to massive customer insights to help target customers and merchandise more effectively.

How to Use Big Data

In the ever-changing world of retail, staying up-to-date is a challenge and a necessity. The top three things that retailers need to know are: how effective their pricing is, when shoppers are most active, and what items they buy together. These three key insights can be derived from big data, but consistent mining is the only way to really enjoy the benefits. I’ll break these down into pricing and shopping behavior to explore big data’s impact on each.

Pricing: Do you know the optimal price for each of your products? Unless you’re psychic, you will need to test slightly different prices to determine which one delivers the best sales and profit margins. Not all retailers are able to price perfectly the first time around, and that “perfect” price rarely stays the same. A bathing suit in June will sell at a higher price than in December. Forecast demand based on historical sales data and price according to variables such as seasonality and competitor prices.
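As a rough illustration of demand-based seasonal pricing, here is a minimal sketch; the sales figures, base price and the 20 percent cap are assumptions for the example, not recommendations.

```python
# Hypothetical monthly unit sales for a bathing suit; a retailer would pull
# these numbers from its own historical sales data.
monthly_sales = {"Jun": 420, "Jul": 380, "Sep": 150, "Dec": 60}
base_price = 30.00
avg_demand = sum(monthly_sales.values()) / len(monthly_sales)

def seasonal_price(month: str) -> float:
    """Nudge the price up in high-demand months and down in low-demand ones."""
    factor = monthly_sales[month] / avg_demand
    # Cap the adjustment at +/- 20% so prices stay within a plausible band.
    factor = max(0.8, min(1.2, factor))
    return round(base_price * factor, 2)

print(seasonal_price("Jun"), seasonal_price("Dec"))  # higher in June than in December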

Shopper behavior: Demand tends to peak during the evening and on the weekend. Retailers can alter pricing based on traffic and conversions to maximize sales and profit. Say it’s the end of the season and the retailer wants to make room in their warehouse for the upcoming season’s styles. Dropping prices when traffic is high will help products move.

On the other hand, raising prices slightly when demand is high will help pull in more profit margin from each sale. Similarly, promoting products that shoppers often buy together can boost your average order value and help products move more quickly. Automatically bundling these products can be a time-saver for busy shoppers and encourage them to buy by showing that getting all the items together is cheaper than buying them separately. Increase average order value while building loyalty.
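One simple way to act on the bundling idea is to price a bundle slightly below the sum of its parts. The sketch below is purely illustrative, with made-up catalogue prices and a 10 percent discount chosen arbitrarily; which items to bundle would come from the co-purchase analysis sketched earlier.

```python
# Made-up catalogue prices for the example.
prices = {"swimsuit": 30.00, "beach towel": 12.00, "sunscreen": 9.00}

def bundle_price(items, discount=0.10):
    """Return (bundle price, price of the same items bought separately)."""
    separately = sum(prices[item] for item in items)
    return round(separately * (1 - discount), 2), separately

bundle, separately = bundle_price(["swimsuit", "beach towel", "sunscreen"])
print(f"Bundle: ${bundle} vs. bought separately: ${separately}")
```

Showing both numbers at checkout is what makes the saving visible to the shopper and nudges the larger order.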

Big data isn’t a new concept, but it is a resource to tap into sooner rather than later. All of the answers you need are right in front of you; it’s just a matter of crunching the numbers. The insights that big data offers can take your retail business from zero to 60 in no time. Mining, analyzing, and acting on data is the only way to make the informed decisions that will ensure the success of your retail business.

How have you used big data and what impact have you seen?

Prepare yourself—and your brand—for the creator economy


When Henry Ford introduced the first assembly line for manufacturing the Model T, the price of a car dropped to $265 by 1925. This lower price threshold eliminated barriers to purchase for the average worker and signaled the start of the Industrial Era.

Industrial production suddenly became so efficient that companies found themselves needing to create demand. This demand-creation model ushered in the era of what became known as the consumer economy and the domination of mass media as a marketing tool. Consumers were suddenly bombarded with messages urging them to fill their lives and their homes with all manner of goods.

After the financial crash of 2008, Millennials, who grew up turning off the TV and ignoring web advertising, began to dominate the economy. Mass media found itself competing with social networks like YouTube, Facebook, and Twitter.

Modern consumers are savvy and highly engaged, and they demand personal attention and stellar experiences. They refuse to be passive “consumers” and actively seek out engagement: editing Wikipedia entries, sharing customer-service highs and lows on their social networks, and embracing peer-to-peer services that let them share their houses, cars and household items, or leave their dog with a host for the weekend.

Welcome to the Creator Economy. Here are five guiding principles for this new era.

No. 1: Change the Customer Relationship

Instead of keeping the customer under control, companies should embrace the paradox that the best way to retain customers is to set them free. They should engage with the customer on a one-to-one basis while bearing in mind that every point of engagement matters.

Each act of engagement, whether it’s a tweet, a click, a location check-in, or an IoT sensor reading, can be used not only to optimize the customer experience in real time, but also to create the fundamental business value that drives business models in the creator economy. It’s all about engaging with the customer by sharing relevant content based on real-time context.

No. 2: Change the Partner Relationship

The go-to-market strategy has to evolve from a traditional linear process to a multi-sided business model, where transacting business with, through, and on behalf of third parties is key to the success of the business model.

Third parties see great value in those established customer relationships and recognize the service provider as a potential distribution channel for their own service offerings. For the service provider, this creates a virtuous circle inherent to the multi-sided platform opportunities. The value for end customers grows with combined offerings, customer loyalty increases, the customer base is broadened and consequently attracts more partners.

No. 3: Disrupt your own revenue models

Companies are transitioning from selling products to selling a personalized service, a specific experience or a negotiated outcome. While we are most familiar with subscriptions in the software industry, we now see this model arising in the manufacturing space with brands like Rolls-Royce, Hilti and Lexmark, and even makers of x-ray machines.

Selling an outcome becomes a new way of engaging with a customer. Rather than measuring the service delivered, the focus will be on gauging the actual business benefit. This requires companies to structure their business models to operate not on what they can package, sell and measure, but on what downstream benefits are generated and how much they contributed to that outcome. The focus must be on value, not revenue.

No. 4: Plan and build a secure, scalable infrastructure

Growth strategies have implications that you must take into account when designing your delivery process. Be prepared for:

  • Smooth and efficient expansion into new regions, where currencies and payment habits will differ: supporting local payment preferences, such as offering invoicing rather than cash on delivery, can be key to increasing the checkout conversion rate.
  • Managing high volumes of customers and transactions that will grow by multiple orders of magnitude when moving from product to service.

No. 5: Agility should be part of your DNA

While constant efficiency improvements are a prerequisite for a healthy bottom line, they are no longer sufficient in the creator economy, which is iterative and rapidly changing.

Companies have to go for simplicity, provide autonomy and embed intelligence directly into their production and business processes to help them adapt quickly to changing needs. Simplicity can take the form of a cloud delivery model; autonomy can translate into microservices and APIs that let a community of solution providers add and build new functionality; and intelligence can mean sensors, analytics, predictive capabilities, and in-memory databases.

These five steps should set you on the way to becoming a customer experience leader in the creator-economy era.

Identifying Performance Problems in ABAP Applications


Analyze Statistics Records Using the Performance Monitor (Transaction STATS)

IT landscapes tend to become more complex over time as business needs evolve and new software solutions for meeting these needs emerge. This complexity can become a challenge for maintaining smooth-running business operations and for ensuring the performance and scalability of applications.

Performance means different things to different people. End users demand a reasonable response time when completing a task within a business process. IT administrators focus on achieving the required throughput while staying within their budget, and on ensuring scalability, so that a software’s resource consumption changes predictably with its load (the number of concurrent users or parallel jobs, or the number or size of processed business objects, for example). For management, optimal performance is about more productive employees and lower costs.

The overall objective is to reach the best compromise between these perspectives. This requires reliable application monitoring data, so that developers or operators can analyze an application’s response time and resource consumption, compare them with expectations, and identify potential problems. SAP customers running SAP NetWeaver Application Server (SAP NetWeaver AS) ABAP 7.4 or higher can access this data with a new, simple, user-friendly tool: the Performance Monitor (transaction STATS).

This article provides an introduction to the statistics records that contain this monitoring data for applications that run on the ABAP stack, and explains how you can use transaction STATS to analyze these records and gain the insights you need to identify applications’ performance issues.

What Are Statistics Records?

Statistics records are logs of activities performed in SAP NetWeaver AS ABAP. During the execution of any task (such as a dialog step, a background job, or an update task) by a work process in an ABAP instance, the SAP kernel automatically collects header information to identify the task, and captures various measurements, such as the task’s response time and total memory consumption. When the task ends, the gathered data is combined into a corresponding statistics record. These records are stored chronologically — initially in a memory buffer shared by all work processes of SAP NetWeaver AS ABAP. When the buffer is full, its content is flushed to a file in the application server’s file system. The collection of these statistics records is a technical feature of the ABAP runtime environment and requires no manual effort during the development or operation of an application.

The measurements in these records provide useful insights into the performance and resource consumption of the application whose execution triggered the records’ capture, including how the response times of the associated tasks are distributed over the involved components, such as the database, ABAP processing (CPU), remote function calls (RFCs), or GUI communication. A detailed analysis of this information helps developers or operators determine the next steps in the application’s performance assessment (such as the use of additional analysis tools for more targeted investigation), and identify potential optimization approaches (tuning SQL statements or optimizing ABAP coding, for example). In addition to performance monitoring, the statistics records can be used to assess the system’s health, to evaluate load tests, and to provide the basis for benchmarks and sizing calculations.

Productive SAP systems process thousands of tasks each second and create a corresponding number of statistics records, which contain valuable measurements for identifying performance issues. While existing tools provide access to the data in these records, transaction STATS offers an enhanced user interface that makes it much easier for you to select, display, and evaluate the data in statistics records, and devise a targeted plan for optimization.

Ensuring the Validity of Your Measurements

The value of a performance analysis depends on the quality of the underlying measurements. While the collection of data into the statistics records is performed autonomously by the SAP kernel, some preparatory actions are needed to ensure that the captured information accurately reflects the performance of the application.

The test scenario to be executed by the application must be set up carefully; otherwise, the scenario will not adequately represent the application’s behavior in production and will not yield the insights you need to identify the application’s performance problems. A set of test data that is representative of your productive data must be available to execute the scenario in a way that resembles everyday use. The test system you will use for the measurements must be configured and customized correctly — for example, the hardware sizing must be sufficient and the software parameterization must be appropriate, so that the system can handle the load. To obtain reliable data, you must also ensure that the test system is not under high load from concurrently running processes — for example, users should coordinate their test activities to make sure there are no negative interferences during the test run.

You must then execute the scenario a few times in the test system to fill the buffers and caches of all the involved components, such as the database cache, the application server’s table buffer, and the web browser cache. Otherwise, the measurements in the statistics records will not be reproducible, and will be impaired by one-off effects that load data into these buffers and caches. This will make it much more difficult to draw reliable conclusions — for example, buffer loads trigger requests to the database that are significantly slower than getting the data out of the buffer, and that increase the amount of transferred data. After these initial runs, you can execute the measurement run, during which the SAP kernel writes the statistics records that you will use for the analysis.

Displaying the Statistics Records

To display the statistics records that belong to the measurement run, call transaction STATS. Its start screen (see Figure 1) consists of four areas, where you specify criteria for the subset of statistics records you want to view and analyze.

Figure 1 — On the STATS start screen, define filter conditions for the subset of statistics records you want to analyze, specify from where the records are retrieved, and select the layout of the data display

In the topmost area, you determine the Monitoring Interval. By default, it extends 10 minutes into the past and 1 minute into the future. Records written during this period of time are displayed if they fulfill the conditions specified in the other areas of the start screen. Adjust this interval based on the start and end times of the measurement run so that STATS shows as few unrelated records as possible.

In the Record Filter area, you define additional criteria that the records to be analyzed must meet — for example, client, user, or lower thresholds for measurement data, such as response time or memory consumption. Be as specific and restrictive as possible, so that only records relevant for your investigation will be displayed.

By default, statistics records are read from all application instances of the system. In the Configuration section, you can change this to the local instance, or to any subset of instances within the current system. Restricting the statistics records retrieval to the instance (or instances) where the application was executed shortens the runtime of STATS. The Include Statistics Records from Memory option is selected by default, so that STATS will also process records that have not yet been flushed from the memory buffer into the file system.

Under Display Layout, select the resource you want to focus on and how the associated subset of key performance indicators (KPIs) — that is, the captured data — will be arranged in the tabular display of statistics records. The Main KPIs layouts provide an initial overview that contains the most important data and is a good starting point.

Analyzing Selected Statistics Records

Figure 2 shows the statistics record display based on the settings specified in the STATS start screen. The table lists the selected statistics records in chronological order and contains their main KPIs.

Figure 2

The header columns — shown with a blue background — uniquely link each record to the corresponding task that was executed by the work process. The data columns contain the KPIs that indicate the performance and resource consumption of the tasks. Measurements for times are given in milliseconds (ms) and memory consumption and data transfer are measured in kilobytes (KB).

The table of statistics records is displayed within an ALV grid control and inherits all functions of this well-known SAP GUI tool: You can sort or filter records; rearrange, include, or exclude columns; calculate totals and subtotals; or export the entire list. You can also switch to another display layout or modify the one you have chosen on the start screen. To access these and other standard functions, expand the toolbar by clicking on the Show Standard ALV Functions button.

The measurements most relevant for assessing performance and resource consumption are the task’s Response Time and Total Memory Consumption. The Response Time measurement starts on the application instance when the request enters the dispatcher queue and ends when the response is returned. It does not include navigation or rendering times on the front end, or network times for data transfers between the front end and the back end. It is strictly server Response Time; the end-to-end response time experienced by the application’s user may be significantly longer. The most important contributors to server Response Time are Processing Time (the time it takes for the task’s ABAP statements to be handled in the work process) and DB Request Time (the time that elapses while database requests triggered by the application are processed). In most cases, Total Memory Consumption is identical to the Extended Memory Consumption, but Roll Memory, Paging Memory, or Heap Memory may also contribute to the total.
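To illustrate how these KPIs relate to one another, here is a small sketch that computes the main shares for a single record. The dictionary is a simplified, hypothetical stand-in for a statistics record; it mirrors the fields described above but is not the format STATS itself uses.

```python
# Hypothetical main KPIs of one statistics record (times in ms, memory in KB).
record = {
    "response_time": 840,
    "processing_time": 610,
    "db_request_time": 190,
    "roll_wait_time": 25,
    "extended_memory": 9400,
    "roll_memory": 120,
    "heap_memory": 0,
}

db_share = record["db_request_time"] / record["response_time"]
cpu_share = record["processing_time"] / record["response_time"]
total_memory = sum(record[k] for k in ("extended_memory", "roll_memory", "heap_memory"))

print(f"DB share of server response time:  {db_share:.0%}")
print(f"Processing share of response time: {cpu_share:.0%}")
print(f"Total memory consumption:          {total_memory} KB")
```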

Since even the most basic statistics record contains too much data to include in a tabular display, STATS enables you to access all measurements of a certain record by double-clicking on any of its columns, most notably the breakdowns of the total server Response Time and the DB Request Time, and the individual contributions to Total Memory Consumption. This leads to an itemized view of the record’s measurements in a pop-up window, as shown in Figure 3. At the top, it identifies the particular statistics record via its header data. The up and down triangles to the left of the header data support record-to-record navigation within this pop-up. The available technical data is grouped into categories, such as Time, DB, and Memory and Data. Use the tabs to navigate between categories containing data. Tabs for categories without data for the current statistics record are inactive and grayed out.

Figure 3

To assess the data captured in a statistics record, consider the purpose that the corresponding task serves. OLTP applications usually spend about one fourth of their server Response Time as DB Request Time and the remainder as Processing Time on the application server. For tasks that invoke synchronous RFCs or communication with SAP GUI controls on the front end, associated Roll Wait Time may also contribute significantly to server Response Time. For OLTP applications, the typical order of magnitude for Total Memory Consumption is 10,000 KB. Records that show significant upward deviations may indicate a performance problem in the application, and should be analyzed carefully using dedicated analysis tools such as transaction ST05 (Performance Trace). In comparison, OLAP applications usually create more load on the database (absolute as well as relative) and may consume more memory on the application server.
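Building on the rules of thumb above, a screening step could flag records whose DB share or memory consumption deviates strongly upward and therefore deserve a closer look with a dedicated tool such as transaction ST05. The thresholds and the record layout below are illustrative assumptions, not fixed limits.

```python
# Illustrative screening of statistics records against rough OLTP rules of thumb.
records = [
    {"task": "dialog step A", "response_time": 420, "db_request_time": 95,  "total_memory": 8200},
    {"task": "dialog step B", "response_time": 950, "db_request_time": 720, "total_memory": 64000},
]

DB_SHARE_LIMIT = 0.50      # well above the ~25% typical for OLTP tasks
MEMORY_LIMIT_KB = 50_000   # well above the ~10,000 KB order of magnitude

for rec in records:
    db_share = rec["db_request_time"] / rec["response_time"]
    if db_share > DB_SHARE_LIMIT or rec["total_memory"] > MEMORY_LIMIT_KB:
        # Candidate for a targeted follow-up, e.g. an SQL trace with ST05.
        print(f"{rec['task']}: DB share {db_share:.0%}, "
              f"memory {rec['total_memory']} KB -> investigate")
```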

Saving Statistics

As mentioned earlier, productive systems create statistics records with a very high frequency, leading to a large volume of data that has to be stored in the application server’s file system. To limit the required storage space, the SAP kernel reorganizes statistics records that are older than the number of hours set by the profile parameter stat/max_files and aggregates them into a database table. After the reorganization, STATS can no longer display these records.

If you need to keep statistics — that is, a set of statistics records that match conditions specified on the STATS start screen — for a longer period of time for documentation reasons, reporting purposes, or before-after comparisons, you have two options:

  • Export the statistics to a binary file on your front-end PC
  • Save them into the statistics directory on the database

Both options are available via the corresponding buttons (Export Statistics to Local Front End and Save Statistics to Database, respectively) on the STATS start screen (Figure 1) and in the tabular display of the statistics records (Figure 2).

To access and manage the statistics that were saved on the database, click on the Show Statistics Directory button on the STATS start screen (Figure 1), which takes you to the statistics directory shown in Figure 4. In the two areas at the top of the screen, you specify conditions that the statistics must fulfill to be included in the list displayed in the lower part of the screen. Statistics are deleted automatically from the database four weeks after they have been saved. You can adjust this default Deleted On date so that the data is still available when you need it. Similarly, you can change the description, which you specified when the statistics were saved. Double-clicking on a row in the directory displays the corresponding set of statistics records, as shown in Figure 2. All capabilities described previously are available.

Figure 4

Statistics that were exported to the front-end PC can be imported either into the STATS start screen, which presents the content in the tabular display shown in Figure 2, or into the statistics directory, which persists it to the database. In both cases, the import function is accessed by clicking on the Import Statistics from Local Front End button. You can also import statistics into an SAP system that is different from the system where the statistics were exported. This enables the analysis of statistics originating from a system that is no longer accessible, or cross-system comparisons of two sets of statistics.

Conclusion

To optimize the performance of applications and ensure their linear scalability, you need reliable data that indicates the software’s response times and resource consumption. Within the ABAP stack of SAP solutions, this data is contained in the statistics records that the SAP kernel captures automatically for every task handled by a work process. Using this information, you can identify critical steps in an application, understand which component is the bottleneck, and determine the best approach for optimization.

The Performance Monitor (transaction STATS) is a new tool available with SAP NetWeaver 7.4 for selecting, displaying, and analyzing statistics records. It helps developers and operators find the best balance between fast response times, large data throughput, high concurrency, and low hardware cost. The tool is easy and convenient to use, and employs a feature-rich UI framework so that you can focus on the data and its interpretation, and set a course for high-performing applications.

Internet of Things & Marketing

by @Raghavendra Deshpande – Business Development Expert

5 Ways the Internet of Things Will Make Marketing Smarter


The Internet of Things (IoT) is a real technological revolution that will impact everything we do. It’s a gigantic wave of new possibility that is destined to change the face of technology as we know it.

IoT is the interconnectivity between things, using wireless communication technology (each object carrying its own unique identifier) to connect objects, locations, animals, or people to the internet, thus allowing the direct transmission and seamless sharing of data.

IoT will have an enormous impact on the way we do business, specifically where marketing is concerned. Here are five ways that IoT will improve marketing ROI:

  1. Easy Exchange of Sales Data
  2. Smarter CRM: Instantaneous Customer Analysis
  3. Devices That Know They’re Dying
  4. Predictive Social Media
  5. Imagine a 100% CTR (Click-Through Rate)

To learn more about IoT initiatives for marketing, visit here.

Follow Raghu on Twitter: @raghavitchamps

Follow ITChamps on Twitter: @ITChamps_SAP

ITChamps iEmpPower – Self Service Solution


This is an alternative to SAP’s Employee and Manager Self Services portal. The product is available both as an on-premise solution and in the cloud.

It has a built-in User Management Engine (UME), a Workflow Engine and a synchronization module, which connects with the iEmpPower™ add-on installed in the SAP ECC system. The integration is seamless and gives end users an enhanced experience while saving customers a substantial amount of money on licenses.

The best part of the iEmpPower™ solution is that it works in tandem with SAP and can also run as a standalone system without SAP at the back end. Connectors can be built in the future for any other back-end system the customer may already have, and the solution can also be extended into a vendor portal or customer portal.

iEmpPower™ focuses on self-service functions for employees and managers, as well as the primary activities of a Human Resource Administrator (HR Admin).

Features in iEmpPower™:

  • Personal Information
  • Work Timing 
  • Benefits and Payments 
  • Workflows
  • Travel Management System
  • Performance Management System

Values Delivered:

  • Key self-service benefits
  • Making employee value equal to company value
  • Building a business case
  • Aligning your self-service strategy