SAP Cloud Services | SAP Cloud Solutions

Talent management challenges are reaching new heights with rising business complexity and intense competition in the market. These challenges have pushed HR leaders to look for scalable solution platforms that can help in:

  • Making HR programs measurable in financial terms
  • Delivering workforce metrics & analytics
  • Promoting a highly engaged workforce & a high-performance culture
  • Facilitating a strong self-learning culture
  • Identifying & developing future leaders

Hexaware’s HCM Center of Excellence team can quickly help you deploy SAP SuccessFactors HR and talent management solutions to transform your HR operations. With SAP SuccessFactors, Hexaware can help you align your core HR strategy with your business strategy.

Service Offerings

  • Advisory – Cloud Readiness, HR Process Assessment, Fit-Gap analysis
  • Implementation and Rollouts
  • Application Management Support
  • Integration
  • On-premise to Cloud Migrations

Read more: http://hexaware.com/sap-cloud-services.htm

Backup for SAP | ERP System

This blog discusses the ways of taking backups and their significance.

Storage is vital to any computer system. It is doubly vital to an ERP system, which serves as the memory of the business. Lose that memory and you’re out of business.

Backups are copies of all the important data on your system, taken and preserved in such a way that you can recover your data no matter what happens. Making backups and being sure you have good ones is a best business practice. Typically, an SAP system will have one or more storage administrators to take care of storage and backup systems. However, as the manager of the SAP system, it’s important that you understand the basics of backup, because it is so intimately connected with running a successful SAP system.

Basically, ‘backup’ refers to three different things, only one of which is truly backup. There is short-term backup, which preserves a copy of the data for a short time, typically a week or two. There is true backup, which saves a copy for a year or so. Then there is archival backup, which saves important data permanently – or at least for five years.

These three serve very different purposes. Short-term storage is designed to protect files from corruption or deletion, usually accidental and generally within a short time of creation. It is marked by very fast restore times and is the most commonly used form. True backup is designed to keep a copy of the files available for a longer period of time for restoring data; it may last for a year or even longer. Archival storage is designed to save important data permanently. Short-term storage usually relies on the active disk or disks. True backup uses longer-term disk storage, often on slower, less expensive disks or on tape. Archival storage is typically on magnetic tape, although hard disk arrays are becoming more common. Archival material is often moved off-site to a secure facility for permanent storage.

Performance Testing in an Agile Environment

Overview

This article will help you gain a closer understanding of how you can integrate performance testing in Agile processes. Performance testing is an essential activity in all software development projects, including Agile ones. From initial planning to production analysis, application performance drives the development of better software iterations and releases. Application owners, stakeholders, developers, and testers should make performance a key consideration in every sprint, or iteration, in Agile development.

Introduction to the Approach

The approach starts by examining the software development project as a whole, the reasons why stakeholders have chosen to include performance testing in the project, and the value that performance testing is expected to add to the project. The Agile approach breaks through the barriers of conventional waterfall approaches to software development to deliver business value sooner and accelerate return on investment (ROI).

Performance testing is an integral part of Agile processes; it can help your organization develop higher-quality software in less time while reducing development costs. The goal is to test performance early and often in the development effort, and to test functionality and performance in the same sprint. Performance testing is taken into consideration throughout the Agile SDLC—from the release planning stage onward.

HP Diagnostics


Overview
Identifying and correcting availability and performance problems can be costly, time-consuming and risky. IT organizations spend more time identifying an owner than resolving the problem.
HP Diagnostics helps to improve application availability and performance in pre-production and production environments. HP’s diagnostics software is used to drill down from the end user into application components and cross-platform service calls to resolve the toughest problems, including slow services, methods, SQL, out-of-memory errors, threading problems and more.

How HP Diagnostics software works
During a performance test, HP Diagnostics software traces J2EE, .NET, ERP, and CRM business processes from the client side across all tiers of the infrastructure. The modules then break down each transaction response time into time spent in the various tiers and within individual components.

Performing Manual Correlation with Dynamic Boundaries in LR

What is correlation: It is a process for handling dynamic values in our script. The dynamic value is replaced by a parameter whose value we assign or capture from the server response.

Ways to do correlation: There are two ways to do correlation.

They are as follows:

  • Auto-correlation: the correlation engine in LoadRunner detects the dynamic value and replaces it with a parameter automatically
  • Manual correlation: this requires a good understanding of the script and the server responses. Manual correlation can be a bit complex at times, but it is always the preferred method for handling dynamic values in our script

Usually, manual correlation is done by capturing the dynamic value that is present between static left and right boundaries.
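For illustration, here is a minimal sketch of what a static-boundary capture looks like in a LoadRunner (VuGen) script; the parameter name, boundaries and URLs below are hypothetical placeholders, not taken from any particular application:

    // Register the capture *before* the request whose response contains the value.
    // "jsessionid=" and ";" stand in for the static left and right boundaries.
    web_reg_save_param("SessionId",
                       "LB=jsessionid=",
                       "RB=;",
                       LAST);

    web_url("login",
            "URL=http://example.com/login",
            LAST);

    // The captured value can now be used in later requests as {SessionId}.
    web_url("account",
            "URL=http://example.com/account?session={SessionId}",
            LAST);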

Objective: The intention of this article is to give a method that is useful for capturing and handling dynamic values when even the left and right boundaries themselves are dynamic.
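When the boundaries themselves change between runs, one possible approach is to capture the changing part of the boundary from an earlier response and then build the real boundary string at run time. The sketch below is only an illustration under that assumption (the boundary strings, parameter names and URLs are hypothetical), not necessarily the exact method described in the full article:

    char lb[256];

    // Step 1: capture the dynamic portion of the boundary using a known static anchor.
    web_reg_save_param("DynPrefix", "LB=name=\"token_", "RB=\"", LAST);
    web_url("step1", "URL=http://example.com/step1", LAST);

    // Step 2: compose the left boundary for the real capture from the value just saved.
    sprintf(lb, "LB=token_%s value=\"", lr_eval_string("{DynPrefix}"));

    // Step 3: register the capture with the runtime-built boundary before the next request.
    web_reg_save_param("DynValue", lb, "RB=\"", LAST);
    web_url("step2", "URL=http://example.com/step2", LAST);

    // {DynValue} now holds the dynamic value and can be substituted into later requests.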

HP Ajax TruClient – Overview with Tips and Tricks

Overview

  • In LoadRunner 11.5, TruClient for Internet Explorer was introduced. It is now possible to use TruClient on IE-only web applications.

Note: This still supports only HTML + JavaScript websites. It does not support ActiveX objects, Flash, Java applets, etc.

  • TruClient IE was developed as an add-in for IE 9, so it will not work on earlier versions of IE. IE 9 was the first version to expose enough of the DOM to be usable by TruClient-style Vusers. Note that your web application must support IE 9 in “standard mode”.
  • Some features have also been added to TruClient Firefox. These include:
    • The ability to specify think time
    • The ability to set HTTP headers
    • URL filters
    • Event handlers, which can automatically handle intermittent pop-up windows, etc.
  • Web page breakdown graphs have been added to TruClient (visible in LoadRunner Analysis). Previously they were only available for standard web Vusers.

XML Optimization through custom Properties

1. Problem Statement:

I am creating an XML file as output. If my source is empty, is there a way to avoid the creation of an empty XML file?

Sample output data with source data:

Case 1: Empty Source – Creation of Minimal XML file

We have to set the following properties of the XML target at the session level, under the Mapping tab:

  • Null Content Representation – “No Tag”
  • Empty String Content Representation – “No Tag”
  • Null Attribute Representation – “No Attribute”
  • Empty String Attribute Representation – “No Attribute”

The Output file is as follows

Note: It generates the minimal XML with just the parent tag. The parent tags are shown as unary tags in the browser.

Advanced Replication Setup for High Availability and Performance

In my personal opinion, Oracle leads the market in directory product offerings (LDAP directories). From Oracle Internet Directory (OID) to the latest Oracle Unified Directory (OUD), Oracle provides a variety of LDAP directory products for integration.

With the increasing demand for mobile computing and cloud computing offerings, there is a need to standardize LDAP deployments for Identification, Authentication and (sometimes) Authorization (IAA) services. With a highly scalable, high-performing, highly available, stable and secure LDAP directory, these IAA services become easier to integrate with applications in the cloud or with mobile applications.

Introduction

Oracle Unified Directory (OUD) is the latest LDAP directory offering from Oracle Corp. As mentioned in my previous post, OUD comes with three main components. They are:

  • Directory Server
  • Proxy Server
  • Replication Server

Here, the Directory Server provides the main LDAP functionality (I assume you already know what an LDAP directory server is). The Proxy Server is used to proxy LDAP requests (how?). And the Replication Server is used for replicating (copying) data from one OUD to another OUD, or even to an ODSEE server (we will talk more about replication in this post). You can read my first post on OUD here. In this article, I will write about the replication server and an advanced replication setup for Oracle Unified Directory.

Transitioning to a New World – An Analytical Perspective

Recently, I had the opportunity to speak at the Silicon India Business Intelligence Conference. The topic I chose for the discussion was focused on providing a BI & Analytics perspective for companies transitioning to a new world. You can view my presentation at this link: http://bit.ly/VLDDfF

The gist of my presentation is given below:

1) First, I established the fact that the world is indeed changing by showing some statistics:

  • Data Deluge: The amount of digital data created in the world right now stands at 7 zettabytes per annum (1 zettabyte = 1 billion terabytes)
  • Social Media: Facebook has touched 1 billion users, which would make it the 3rd largest country in the world
  • Cloud: A tremendous amount of cloud infrastructure is being created
  • Mobility: There are 4.7 billion mobile subscribers, covering 65% of the world’s population

2) Enterprises face a very different marketplace due to the profound changes taking place in the way people buy, sell, interact with one another, spend their leisure time, etc.

3) To ensure that BI can help businesses navigate the new normal, there are 3 key focus areas:

  • Remove Bottlenecks – give the business what it wants
  • Enhance Intelligence
  • End-to-End Visibility – by strengthening the fundamentals

For each of the 3 areas mentioned above, I gave some specific examples of the trends in the BI space.

1) For removing bottlenecks, the impact of in-memory and columnar databases was elaborated.

2) For enhancing intelligence, working with unstructured data and using big data techniques were discussed.

3) For the third point, the focus was on strengthening the fundamentals in the BI landscape.

Please do check out my complete presentation at http://bit.ly/VLDDfF and let me know your views.

Thanks for reading.

Collaborative Data Management – Need of the hour!

Well, the topic may seem like a pretty old concept, yet it is a vital one in the age of Big Data, Mobile BI and Hadoop! As per the FIMA 2012 benchmark report, Data Quality (DQ) still remains the topmost priority in data management strategy.

‘What gets measured improves!’ But often a Data Quality (DQ) initiative is a reactive strategy as opposed to a proactive one; consider the impact bad data could have in a financial reporting scenario – a tarnished brand, loss of investor confidence.

But are the business users aware of DQ issues? A research report by ‘The Data Warehousing Institute’ suggested that more than 80% of the business managers surveyed believed that their business data was fine, but just half of their technical counterparts agreed! Having recognized this disparity, it would be a good idea to map the dimensions of data quality to the business problems created by the lack of it.

Data Quality Dimensions – IT Perspective

 

  • Data Accuracy – the degree to which data reflects the real world
  • Data Completeness – inclusion of all relevant attributes of data
  • Data Consistency – uniformity of data across the enterprise
  • Data Timeliness – is the data up to date?
  • Data Auditability – is the data reliable?

 

Business Problems – Due to Lack of Data Quality

Department/end-users, their business challenges and the data quality dimensions* involved:

  • Human Resources – The actual employee performance as reviewed by the manager is not in sync with the HR database; inaccurate employee classification based on government classification groups (minorities, differently abled). Dimensions: consistency, accuracy
  • Marketing – Print and mailing costs associated with sending duplicate copies of promotional messages to the same customer/prospect, or sending them to the wrong address/email. Dimensions: timeliness
  • Customer Service – Extra call support minutes due to incomplete customer data and poorly defined metadata for the knowledge base. Dimensions: completeness
  • Sales – Lost sales due to the lack of proper customer purchase/contact information, which prevents the organization from performing behavioral analytics. Dimensions: consistency, timeliness
  • ‘C’ Level – Reports that drive top-management decision making are not in sync with the actual operational data; difficulty getting a 360° view of the enterprise. Dimensions: consistency
  • Cross Functional – Sales and financial reports are not in sync with each other – typically data silos. Dimensions: consistency, auditability
  • Procurement – The procurement levels of commodities differ from the requirements of production, resulting in excess/insufficient inventory. Dimensions: consistency, accuracy
  • Sales Channel – There are different representations of the same product across e-commerce sites, kiosks and stores, and the product names/codes in these channels differ from those in the warehouse system. This results in delays or wrong items being shipped to the customer. Dimensions: consistency, accuracy

*Just a perspective; there could be other dimensions causing these issues too.

As is evident, data quality is not just an IT issue but a business issue too, and it requires a ‘Collaborative Data Management’ approach (involving business and IT) to ensure quality data. The solution is multifold, spanning the planning, execution and sustainment of a data quality strategy. Aspects such as data profiling, MDM and data governance are vital guards that help to analyze data, get first-hand information on its quality, and maintain that quality on an ongoing basis.

Collaborative Data Management – Approach

Key steps in Collaborative Data Management would be to:

  • Define and measure metrics for data with the business team
  • Assess existing data against the metrics – carry out a profiling exercise with the IT team (a minimal sketch follows this list)
  • Implement data quality measures as a joint team
  • Enforce a data quality firewall (MDM) to ensure correct data enters the information ecosystem, as a governance process
  • Institute Data Governance and Stewardship programs to make data quality a routine and stable practice at a strategic level
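To illustrate the assessment step, here is a minimal profiling sketch in C – a hypothetical example, not tied to any specific tool – that measures one dimension, completeness, as the percentage of non-empty values per column. It assumes a simple comma-separated extract with a header row and no quoted or embedded commas:

    #include <stdio.h>
    #include <string.h>

    #define MAX_COLS 64
    #define MAX_LINE 4096

    int main(int argc, char *argv[])
    {
        FILE *fp;
        char header[MAX_LINE], line[MAX_LINE];
        char *names[MAX_COLS];
        long filled[MAX_COLS] = {0};
        long rows = 0;
        int cols = 0, i;
        char *tok;

        if (argc < 2) {
            fprintf(stderr, "usage: %s data.csv\n", argv[0]);
            return 1;
        }
        if ((fp = fopen(argv[1], "r")) == NULL) {
            perror("fopen");
            return 1;
        }

        /* Read the header row to get the column names. */
        if (fgets(header, sizeof header, fp) == NULL) {
            fclose(fp);
            return 1;
        }
        header[strcspn(header, "\r\n")] = '\0';
        for (tok = strtok(header, ","); tok != NULL && cols < MAX_COLS; tok = strtok(NULL, ","))
            names[cols++] = tok;

        /* Count non-empty values per column. */
        while (fgets(line, sizeof line, fp) != NULL) {
            char *p = line;
            line[strcspn(line, "\r\n")] = '\0';
            for (i = 0; i < cols; i++) {
                char *comma = strchr(p, ',');
                if (comma != NULL)
                    *comma = '\0';
                if (*p != '\0')
                    filled[i]++;
                p = (comma != NULL) ? comma + 1 : p + strlen(p);
            }
            rows++;
        }
        fclose(fp);

        /* Report completeness (% of non-empty values) per column. */
        for (i = 0; i < cols; i++)
            printf("%-30s %6.2f%% complete\n", names[i],
                   rows > 0 ? 100.0 * filled[i] / rows : 0.0);
        return 0;
    }

In practice this kind of profiling would be carried out with a dedicated data quality tool, but the metric itself – and the conversation it starts between business and IT – is the same.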

This approach would ensure that the data ecosystem within a company is kept clean, as it involves business and IT users from each department and at all levels of the hierarchy.

Thanks for reading; I would appreciate your thoughts.