Highest ROI in e-commerce? Email remarketing and retargeted ads


Digital marketers know they must measure and optimize all of their efforts, with the goal of increasing sales. They must also be able to prove a positive return on their investments. As a result, digital marketers are constantly on the hunt for the latest technologies to help with both.

Shopping Cart Abandonment Emails Report Highest ROI

The highest reported ROI comes from shopping cart abandonment emails. This shouldn’t be a surprise: 72 percent of site visitors who place items into an online shopping cart don’t complete the purchase. Since they came close to buying, cart abandoners are now your best prospects. And a sequence of carefully timed emails will recover between 10 and 30 percent of them.

It’s these types of recovery rates that propel shopping cart abandonment emails to the top. They generate millions in incremental revenue for only a small effort and cost.
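To put rough numbers on that claim, here is a back-of-the-envelope sketch. The monthly cart count and average order value are illustrative assumptions; only the 72 percent abandonment rate and the 10 to 30 percent recovery range come from the figures above.

```python
# Back-of-the-envelope estimate of revenue recovered by cart abandonment
# emails. The visitor count and order value are invented for illustration.

def recovered_revenue(monthly_carts, avg_order_value,
                      abandonment_rate=0.72, recovery_rate=0.10):
    abandoned = monthly_carts * abandonment_rate     # carts left behind
    recovered = abandoned * recovery_rate            # buyers won back by email
    return recovered * avg_order_value

# 50,000 carts per month at an $80 average order value:
low = recovered_revenue(50_000, 80, recovery_rate=0.10)   # conservative
high = recovered_revenue(50_000, 80, recovery_rate=0.30)  # optimistic
print(f"${low:,.0f} - ${high:,.0f} per month")
```

Even at the conservative end of the recovery range, the incremental revenue dwarfs the cost of sending a few automated emails.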

Retargeted Ads Complement Shopping Cart Abandonment Emails

The second most successful technique is retargeted advertising, a fantastic complement to shopping cart abandonment emails. Retargeted advertising works in a similar way, by nudging visitors to return to a website after they have left. And while retargeted advertising works across the entire funnel — from landing to purchase — the biggest opportunities lie where there is some level of intent to purchase, such as browsing category and product pages.

While the two techniques both deliver a high ROI, they are definitely not the same. For example, brands using SeeWhy’s Conversion Manager to send their shopping cart recovery emails average a 46 percent open rate and a 15 percent click-through rate. Retargeted ads, by comparison, average a 0.3 percent click-through rate.

See the difference?

The real power comes when you combine the two techniques together — using retargeted advertising when no email address has been captured and email remarketing when it has.

Don’t “Set ‘Em and Forget ‘Em”

To achieve the highest possible ROI when combining cart abandonment emails with retargeted advertising, you should plan to test and tune your campaigns. It’s dangerous to go live with your new campaign and then ‘set it and forget it.’ Testing and tuning your campaign can double or triple your revenues. SeeWhy tracks more than $1B in gross merchandise value (GMV) ecommerce revenues annually and analyzes this data to understand what factors have the biggest impact on conversion.

A SeeWhy study of more than 650,000 individual ecommerce transactions last year concluded that the optimal time for remarketing is immediately following abandonment. Of those abandoners who do return and buy, 72 percent purchase within the first 12 hours.

So timing is one of the critical factors; waiting 24 hours or more means you’re missing at least 3 out of 4 of your opportunities to drive conversions. For example, a shopping cart recovery email campaign that Brand A sends 24 hours after abandonment may be its top-performing campaign. But this campaign delivers half the return of Brand B’s equivalent campaign, which is sent in real time.

Scores of new technologies and techniques will clamor for your attention, making bold claims about their ROI and conversion. But if they aren’t capable of combining shopping cart abandonment emails and retargeted ads, the two biggest ROI drivers in the industry, then they aren’t worth your time.

@JovieSylvia @ITChamps_SAP

Take a look at our website: www.itchamps.com


Identifying Performance Problems in ABAP Applications


Analyze Statistics Records Using the Performance Monitor (Transaction STATS)

IT landscapes tend to become more complex over time as business needs evolve and new software solutions for meeting these needs emerge. This complexity can become a challenge for maintaining smooth-running business operations and for ensuring the performance and scalability of applications.

Performance means different things to different people. End users demand a reasonable response time when completing a task within a business process. IT administrators focus on achieving the required throughput while staying within their budget, and on ensuring scalability, so that a software’s resource consumption changes predictably with its load (the number of concurrent users or parallel jobs, or the number or size of processed business objects, for example). For management, optimal performance is about more productive employees and lower costs.

The overall objective is to reach the best compromise between these perspectives. This requires reliable application monitoring data, so that developers or operators can analyze an application’s response time and resource consumption, compare them with expectations, and identify potential problems. SAP customers running SAP NetWeaver Application Server (SAP NetWeaver AS) ABAP 7.4 or higher can access this data with a new, simple, user-friendly tool: the Performance Monitor (transaction STATS).

This article provides an introduction to the statistics records that contain this monitoring data for applications that run on the ABAP stack, and explains how you can use transaction STATS to analyze these records and gain the insights you need to identify applications’ performance issues.

What Are Statistics Records?

Statistics records are logs of activities performed in SAP NetWeaver AS ABAP. During the execution of any task (such as a dialog step, a background job, or an update task) by a work process in an ABAP instance, the SAP kernel automatically collects header information to identify the task, and captures various measurements, such as the task’s response time and total memory consumption. When the task ends, the gathered data is combined into a corresponding statistics record. These records are stored chronologically — initially in a memory buffer shared by all work processes of SAP NetWeaver AS ABAP. When the buffer is full, its content is flushed to a file in the application server’s file system. The collection of these statistics records is a technical feature of the ABAP runtime environment and requires no manual effort during the development or operation of an application.

The measurements in these records provide useful insights into the performance and resource consumption of the application whose execution triggered the records’ capture, including how the response times of the associated tasks are distributed over the involved components, such as the database, ABAP processing (CPU), remote function calls (RFCs), or GUI communication. A detailed analysis of this information helps developers or operators determine the next steps in the application’s performance assessment (such as the use of additional analysis tools for more targeted investigation), and identify potential optimization approaches (tuning SQL statements or optimizing ABAP coding, for example). In addition to performance monitoring, the statistics records can be used to assess the system’s health, to evaluate load tests, and to provide the basis for benchmarks and sizing calculations.

Productive SAP systems process thousands of tasks each second and create a corresponding number of statistics records, which contain valuable measurements for identifying performance issues. While existing tools provide access to the data in these records, transaction STATS offers an enhanced user interface that makes it much easier for you to select, display, and evaluate the data in statistics records, and devise a targeted plan for optimization.
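To illustrate the kind of triage that this data enables, here is a small Python sketch that sorts simplified records by response time, which is the usual first step of an analysis. The record layout and values are hypothetical; they mirror the KPIs named in this article, not the kernel's actual record format.

```python
# Hypothetical, simplified statistics records and a first triage step:
# find the slowest tasks. Field names echo the KPIs discussed in the text.
from dataclasses import dataclass

@dataclass
class StatsRecord:
    task_type: str          # e.g. "DIA" (dialog step), "BTC" (background job)
    user: str
    response_time_ms: float
    db_request_time_ms: float
    total_memory_kb: float

def top_by_response_time(records, n=3):
    """Return the n slowest tasks, the usual starting point of an analysis."""
    return sorted(records, key=lambda r: r.response_time_ms, reverse=True)[:n]

records = [
    StatsRecord("DIA", "ALICE", 850.0, 210.0, 9_500.0),
    StatsRecord("BTC", "BATCH", 12_400.0, 9_800.0, 48_000.0),
    StatsRecord("DIA", "BOB", 310.0, 70.0, 4_200.0),
]
for r in top_by_response_time(records, n=2):
    print(r.task_type, r.user, r.response_time_ms)
```

Transaction STATS performs this sorting and filtering interactively in an ALV grid; the sketch only shows the shape of the underlying data and question.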

Ensuring the Validity of Your Measurements

The value of a performance analysis depends on the quality of the underlying measurements. While the collection of data into the statistics records is performed autonomously by the SAP kernel, some preparatory actions are needed to ensure that the captured information accurately reflects the performance of the application.

The test scenario to be executed by the application must be set up carefully; otherwise, the scenario will not adequately represent the application’s behavior in production and will not yield the insights you need to identify the application’s performance problems. A set of test data that is representative of your productive data must be available to execute the scenario in a way that resembles everyday use. The test system you will use for the measurements must be configured and customized correctly — for example, the hardware sizing must be sufficient and the software parameterization must be appropriate, so that the system can handle the load. To obtain reliable data, you must also ensure that the test system is not under high load from concurrently running processes — for example, users should coordinate their test activities to make sure there are no negative interferences during the test run.

You must then execute the scenario a few times in the test system to fill the buffers and caches of all the involved components, such as the database cache, the application server’s table buffer, and the web browser cache. Otherwise, the measurements in the statistics records will not be reproducible, and will be impaired by one-off effects that load data into these buffers and caches. This will make it much more difficult to draw reliable conclusions — for example, buffer loads trigger requests to the database that are significantly slower than getting the data out of the buffer, and that increase the amount of transferred data. After these initial runs, you can execute the measurement run, during which the SAP kernel writes the statistics records that you will use for the analysis.
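The effect of those initial runs can be made concrete with a small sketch. The timings below are invented for illustration; the point is simply that warm-up runs are discarded before averaging.

```python
# Discard warm-up runs that populate buffers and caches, then average only
# the measurement runs, as described above. All timings are illustrative.
runs_ms = [2450.0, 1180.0, 640.0, 655.0, 648.0]  # first two runs fill caches

warmup_runs = 2
measurement_runs = runs_ms[warmup_runs:]
avg_ms = sum(measurement_runs) / len(measurement_runs)

print(f"naive average over all runs:   {sum(runs_ms) / len(runs_ms):.0f} ms")
print(f"average over measurement runs: {avg_ms:.0f} ms")
```

Including the cold-cache runs would nearly double the apparent response time, which is exactly the kind of one-off effect that makes measurements irreproducible.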

Displaying the Statistics Records

To display the statistics records that belong to the measurement run, call transaction STATS. Its start screen (see Figure 1) consists of four areas, where you specify criteria for the subset of statistics records you want to view and analyze.

 FIG 1
Figure 1 — On the STATS start screen, define filter conditions for the subset of statistics records you want to analyze, specify from where the records are retrieved, and select the layout of the data display

In the topmost area, you determine the Monitoring Interval. By default, it extends 10 minutes into the past and 1 minute into the future. Records written during this period of time are displayed if they fulfill the conditions specified in the other areas of the start screen. Adjust this interval based on the start and end times of the measurement run so that STATS shows as few unrelated records as possible.

In the Record Filter area, you define additional criteria that the records to be analyzed must meet — for example, client, user, or lower thresholds for measurement data, such as response time or memory consumption. Be as specific and restrictive as possible, so that only records relevant for your investigation will be displayed.

By default, statistics records are read from all application instances of the system. In the Configuration section, you can change this to the local instance, or to any subset of instances within the current system. Restricting the statistics records retrieval to the instance (or instances) where the application was executed shortens the runtime of STATS. The Include Statistics Records from Memory option is selected by default, so that STATS will also process records that have not yet been flushed from the memory buffer into the file system.

Under Display Layout, select the resource you want to focus on and how the associated subset of key performance indicators (KPIs) — that is, the captured data — will be arranged in the tabular display of statistics records. The Main KPIs layouts provide an initial overview that contains the most important data and is a good starting point.

Analyzing Selected Statistics Records

Figure 2 shows the statistics record display based on the settings specified in the STATS start screen. The table lists the selected statistics records in chronological order and contains their main KPIs.

FIG 2

The header columns — shown with a blue background — uniquely link each record to the corresponding task that was executed by the work process. The data columns contain the KPIs that indicate the performance and resource consumption of the tasks. Measurements for times are given in milliseconds (ms) and memory consumption and data transfer are measured in kilobytes (KB).

The table of statistics records is displayed within an ALV grid control and inherits all functions of this well-known SAP GUI tool: You can sort or filter records; rearrange, include, or exclude columns; calculate totals and subtotals; or export the entire list. You can also switch to another display layout or modify the one you have chosen on the start screen. To access these and other standard functions, expand the toolbar by clicking on the Show Standard ALV Functions button.

The measurements most relevant for assessing performance and resource consumption are the task’s Response Time and Total Memory Consumption. The Response Time measurement starts on the application instance when the request enters the dispatcher queue and ends when the response is returned. It does not include navigation or rendering times on the front end, or network times for data transfers between the front end and the back end. It is strictly server Response Time; the end-to-end response time experienced by the application’s user may be significantly longer. The most important contributors to server Response Time are Processing Time (the time it takes for the task’s ABAP statements to be handled in the work process) and DB Request Time (the time that elapses while database requests triggered by the application are processed). In most cases, Total Memory Consumption is identical to the Extended Memory Consumption, but Roll Memory, Paging Memory, or Heap Memory may also contribute to the total.
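The additive relationship between these measurements can be sketched as follows. The millisecond values are illustrative, not real STATS output, and "Other" stands in for remaining contributors such as roll wait time.

```python
# Split a server response time into the contributors named above.
# Values are illustrative; "Other" covers components such as roll wait time.
def breakdown(response_ms, processing_ms, db_request_ms):
    other_ms = response_ms - processing_ms - db_request_ms
    return {
        "Processing Time": processing_ms / response_ms,
        "DB Request Time": db_request_ms / response_ms,
        "Other": other_ms / response_ms,
    }

shares = breakdown(response_ms=1200.0, processing_ms=820.0, db_request_ms=310.0)
for name, share in shares.items():
    print(f"{name}: {share:.0%}")
```

Looking at the shares rather than the absolute times makes it easy to see at a glance which component dominates a slow task.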

Since even the most basic statistics record contains too much data to include in a tabular display, STATS enables you to access all measurements of a certain record, most notably the breakdowns of the total server Response Time and the DB Request Time, and the individual contributions to Total Memory Consumption, by double-clicking on any of its columns. This leads to an itemized view of the record’s measurements in a pop-up window, as shown in Figure 3. At the top, it identifies the particular statistics record via its header data. The up and down triangles to the left of the header data support record-to-record navigation within this pop-up. The available technical data is grouped into categories, such as Time, DB, and Memory and Data. Use the tabs to navigate between categories containing data. Tabs for categories without data for the current statistics record are inactive and grayed out.

FIG 3

To assess the data captured in a statistics record, consider the purpose that the corresponding task serves. OLTP applications usually spend about one fourth of their server Response Time as DB Request Time and the remainder as Processing Time on the application server. For tasks that invoke synchronous RFCs or communication with SAP GUI controls on the front end, associated Roll Wait Time may also contribute significantly to server Response Time. For OLTP applications, the typical order of magnitude for Total Memory Consumption is 10,000 KB. Records that show significant upward deviations may indicate a performance problem in the application, and should be analyzed carefully using dedicated analysis tools such as transaction ST05 (Performance Trace). In comparison, OLAP applications usually create more load on the database (absolute as well as relative) and may consume more memory on the application server.
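A rough screening rule can be derived from these rules of thumb. The thresholds below are heuristics loosely based on the numbers above, not hard limits, and the memory cutoff in particular is an assumption chosen only to mean "well above the usual order of magnitude."

```python
# Screen an OLTP statistics record against the rules of thumb above: flag it
# when DB request time exceeds ~25% of server response time, or when memory
# consumption is well above the ~10,000 KB order of magnitude. Thresholds
# are illustrative heuristics, not hard limits.
def flag_oltp_record(response_time_ms, db_request_time_ms, total_memory_kb,
                     db_share_limit=0.25, memory_limit_kb=30_000):
    flags = []
    if db_request_time_ms > db_share_limit * response_time_ms:
        flags.append("high DB share -> check SQL with ST05")
    if total_memory_kb > memory_limit_kb:
        flags.append("high memory consumption")
    return flags

print(flag_oltp_record(1_000, 600, 9_000))  # DB-heavy dialog step
print(flag_oltp_record(400, 80, 8_000))     # unremarkable
```

A record that trips either check is a candidate for deeper analysis with dedicated tools such as transaction ST05.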

Saving Statistics

As mentioned earlier, productive systems create statistics records with a very high frequency, leading to a large volume of data that has to be stored in the application server’s file system. To limit the required storage space, the SAP kernel reorganizes statistics records that are older than the number of hours set by the profile parameter stat/max_files and aggregates them into a database table. After the reorganization, STATS can no longer display these records.

If you need to keep statistics — that is, a set of statistics records that match conditions specified on the STATS start screen — for a longer period of time for documentation reasons, reporting purposes, or before-after comparisons, you have two options:

  • Export the statistics to a binary file on your front-end PC
  • Save them into the statistics directory on the database

Both options are available via the corresponding buttons (Export Statistics to Local Front End and Save Statistics to Database) on the STATS start screen (Figure 1) and in the tabular display of the statistics records (Figure 2).

To access and manage the statistics that were saved on the database, click on the Show Statistics Directory button on the STATS start screen (Figure 1), which takes you to the statistics directory shown in Figure 4. In the two areas at the top of the screen, you specify conditions that the statistics must fulfill to be included in the list displayed in the lower part of the screen. Statistics are deleted automatically from the database four weeks after they have been saved. You can adjust this default Deleted On date so that the data is still available when you need it. Similarly, you can change the description, which you specified when the statistics were saved. Double-clicking on a row in the directory displays the corresponding set of statistics records, as shown in Figure 2. All capabilities described previously are available.

FIG 4

Statistics that were exported to the front-end PC can be imported either into the STATS start screen, which presents the content in the tabular display shown in Figure 2, or into the statistics directory, which persists it into the database. In both cases, the import function is accessed by clicking on the Import Statistics from Local Front End button. You can also import statistics into an SAP system that is different from the system where the statistics were exported. This enables the analysis of statistics originating from a system that is no longer accessible, or cross-system comparisons of two statistics.

Conclusion

To optimize the performance of applications and ensure their linear scalability, you need reliable data that indicates the software’s response times and resource consumption. Within the ABAP stack of SAP solutions, this data is contained in the statistics records that the SAP kernel captures automatically for every task handled by a work process. Using this information, you can identify critical steps in an application, understand which component is the bottleneck, and determine the best approach for optimization.

The Performance Monitor (transaction STATS) is a new tool available with SAP NetWeaver 7.4 for selecting, displaying, and analyzing statistics records. It helps developers and operators find the best balance between fast response times, large data throughput, high concurrency, and low hardware cost. The tool is easy and convenient to use, and employs a feature-rich UI framework so that you can focus on the data and its interpretation, and set a course for high-performing applications.

Good Advice on Giving Good Speeches


Longtime readers of my blog know I find ideas for blogs everywhere, from psychology experiments to work events to the origin of words and phrases. Lately, however, books have become my primary source of inspiration. In fact, it’s not uncommon for me to have multiple blog ideas after reading a book.

That was the case when I read Seymour Schulich’s Get Smarter: Life and Business Lessons. Earlier this year, in How to Make Better Decisions, I blogged about a simple but practical extension to the traditional two-column pro/con decision list. But there’s much more in the book that’s blog-worthy. For example, Schulich provides good advice on giving good speeches:

  • Be brief
  • Try to communicate one main idea
  • Create a surprise
  • Use humour
  • Slow it down
  • Use cue cards and look up often
  • Self-praise is no honour
  • Never speak before the main dinner course is served
  • Reuse good material
  • Use positive body language

Most of this advice is self-explanatory, except the self-praise line. Schulich means you should never introduce yourself; instead, have someone else tell the audience why you’re important. That way, they’re more likely to pay attention to you.

Given the digital age we live in, Schulich is not really advocating that we present from cue cards. Instead, his goal is to ensure presenters don’t read their speeches and do get out from behind the podium. This forces us to give up our safety nets and increases the likelihood that we connect with the audience.

While you can’t always control when you present, it’s important to recognize the most difficult slot is right before meals. No matter how good a presenter you are, remember the old adage:

Never get between people and their food.

Any presentation tips you want to add?

What Is Going to Be the Future of Documents?

Remember when closing an agreement meant that your team had to go through each page of a paper contract with a client, have them initial or sign by hand, then scan and email it back and forth? Thanks to the emergence of e-signature software, those days are gone.

E-signatures are already having a direct impact on the productivity of companies in a variety of ways. In fact, back in 2013, Ombud Research surveyed United Healthcare and found adopting a paperless e-signature process saved the company more than $1 million in administration costs. The provider-contract turnaround was also significantly reduced, going from an average of 32.5 days to only 2.

Others are seeing benefits, too. Salesforce reported in its 2014 annual report an average savings of $20 per document after implementing electronic signing.

As more businesses realize the benefits of document automation technology, adoption rates will grow, furthering development. Business leaders who don’t adopt this technology soon will be left behind with an outdated process that impedes growth.

To keep up, here’s what’s ahead in document automation:

1. E-signatures will become fully commoditized.

Since the passing of the Electronic Signatures in Global and National Commerce (ESIGN) Act in 2000, signing all agreements on paper is no longer necessary. Electronic signatures for e-commerce agreements are legally binding and protected by the same rights as ink on paper. E-signatures are already increasing in popularity because of their convenience, and in a few years, they will be widely accepted as a transactional commodity.

As adoption grows, the demands for functionality in e-sign tools will grow, too. Signing will move beyond even some of today’s e-signature software features, like uploading a saved image of your personal signature or converting your typed name to script. Eventually, signing won’t require any typing. You’ll be able to sign with a voice command.

2. The use of enterprise automation platforms will expand.

Research from Raab Associates predicted revenue from B2B marketing automation would grow 60 percent last year, reaching $1.2 billion. The adoption of enterprise automation platforms will continue to increase as more companies experience the benefits: faster sales cycles and streamlined collaboration.

In fact, 58 percent of top-performing companies — or those where marketing contributes more than half of the sales pipeline — have already adopted marketing automation, according to a 2014 Forrester report. As marketing automation grows, businesses will be able to process more documents quickly, enabling growth.

B2B growth affects the document landscape, too. Sales is most innovative and efficient when it comes to adopting new technology. In fact, high-performing sales teams are the first to embrace new tech tools to streamline the sales process, with 44 percent using offer management tools, according to Salesforce’s 2015 State of Sales Report.

The rate at which sales grows will serve as a predictor of overall growth.

3. Document assembly will be entirely cloud-based.

Today, most sales documents are created and stored locally, either in PDFs or word processing programs. Creating and storing content in the cloud is a relatively new practice for many companies, but with the increased need to be always connected we’ll see a shift to cloud-based content, which can be accessed from any computer or mobile device.

Cloud-based office suites like Google Docs will be standard, almost entirely replacing word processing software. Compatibility will no longer be an issue, as it was with different versions of word processing documents, which will completely alter the day-to-day experience of people who work with documents. The ability to share and edit documents instantly will support tight deadlines and increase expectations for productivity.

4. Integrations will make projects seamless.

Bringing together data from separate systems that don’t otherwise talk to one another results in one complete view of the entire process. Several CRM integrations have already been developed among various document creation and storage platforms to import and keep track of customer data seamlessly.

Open API will continue to provide a vehicle for people to access and share data regardless of where or how it is stored. Extra steps of printing, signing, and scanning will be completely eliminated.

5. Processing and payment will be instant.

With the increased demand for integrations, there will be no need to upload documents into any system for approvals, payment processing, and storage; cloud-based app integrations will take care of that. Not only will it enable instant credit card transactions, but management approval will be simplified through automated requests managers can view and approve anywhere via mobile device.

Payments will be processed instantly within the document itself through integration with tools like Square and PayPal. Eventually, with the rise of virtual currencies like Bitcoin, smart documents will be able to accept payments, completely cutting out the middleman.

Once documents are processed, they’ll be automatically saved and uploaded right into the integrated cloud storage system of your choice. With a few keywords in the search bar, anyone from the team will be able to pull transaction and approval records immediately.

Even if you’re already using a document automation platform, think about areas of opportunity you could be missing. Many of the features that will be the norm in a couple of years are already available; they’re just not yet widely used. Look at how making some simple changes now might give your organization a head start on better sales efficiency.

What are some other changes you expect to see coming from document automation and e-signature software in the next few years?

PARTNER MANAGED CLOUD vs ON PREMISE DEPLOYMENT

Posted By – Gervasis Paracka


You are driving around in your old faithful car that has never failed you when, out of the corner of your eye, you see a bright red, shiny, flashy car whoosh right past you. SAP on-premise deployment is your faithful car: the one you drive to work every day and took your wife out in on your first date. The red shiny car is partner managed cloud: efficient, fast, and just what you need to get rid of your mid-life crisis.

Partner managed cloud provides traditional SAP solutions hosted on an SAP partner’s private cloud on a subscription basis, in contrast to the high upfront capital required for an on-premise deployment of SAP.

Key benefits of Partner Managed Cloud

  • Strategic: Decreased response times for changes in business processes and new requirements
  • Financial: 30 percent lower total cost of ownership over a span of 5 years, and an optimized, evenly distributed cash flow
  • Operational: Better utilization of resources, with more effort going into SAP functionality than into system maintenance

The low total cost of ownership is driven primarily by the key components of partner managed cloud:

  • Scalability: Increase or decrease resources according to demand, with proper utilization of the SAP landscapes
  • Flexibility: Quick deployment of changes in business processes
  • Speed: Fast deployment of the service to customers, leading to less downtime
  • Pooled resources: Expertise is spread across multiple customers, making resource use more profitable; customers also do not have to invest in dedicated SAP resources

On-premise deployments require more cash upfront, almost 31 percent of the investment in the first year, while partner managed cloud has an evenly distributed cash flow of about 20 percent per year over a period of 5 years.
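Those percentages can be laid out year by year with a small sketch. The 31 percent first-year figure and the 20 percent annual cloud figure come from the text above; spreading the remaining on-premise spend evenly over years 2 to 5 is an assumption made purely for illustration.

```python
# Cumulative cash outflow, indexed to 100 = total 5-year spend.
# On-premise: ~31% in year 1 (remainder split evenly is an assumption).
# Partner managed cloud: an even 20% per year.
on_premise = [31.0] + [69.0 / 4] * 4
cloud = [20.0] * 5

cum_op, cum_cloud = 0.0, 0.0
for year, (op, cl) in enumerate(zip(on_premise, cloud), start=1):
    cum_op += op
    cum_cloud += cl
    print(f"Year {year}: on-premise {cum_op:.2f} vs cloud {cum_cloud:.2f}")
```

Both paths end at the same indexed total, but the on-premise curve is front-loaded, which is exactly the upfront capital burden the subscription model avoids.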

Partner managed cloud services are especially in tune with the needs of small and medium-sized businesses, which need an efficient ERP to streamline all their business processes without having to break the bank.

Discover more on how ITChamps helps to deploy SAP on Cloud services.

SAP FICO Best Practices That, When Followed, Can Drastically Reduce the Number of Issues

BEST PRACTICE

  • Always have one chart of accounts. Never complicate matters by providing for multiple COAs
  • Keep accounts to a minimum, and add them only when another piece of information is required for reporting
  • Copy an existing company code when commencing FI configuration
  • Have rigorous period-end and year-end checks in place, along with postmortem analysis
  • Schedule period-end reports to run in the background
  • Use special periods only for year-end postings
  • Monitor all clearing accounts
  • Schedule the opening and closing of periods, and communicate the period-end tasks to the customer
  • Advise your customer to have regular touch points with their auditors to avoid reporting changes in statutory reports
  • Ensure that manual entries are limited
  • Make all Generally Accepted Accounting Principles (GAAP) adjustments in the company code where they originate
  • Minimize the number of adjustment entries after the trial balances are extracted
  • Always insist on a tightened workflow for approving invoices, payments, and credit notes
  • Use goods-receipt-based (GR-based) invoice verification
  • Monitor aging constantly
  • Match payments automatically to customer/vendor invoices
  • Track overdue debts and act on them
  • Forecast cash receipts

www.itchamps.com