
Sunday, June 6, 2010

SAP Netweaver BI7.0 Online Training

SAP BI 7.0 Course Content:


• Data warehouse fundamentals
• Introduction to SAP R/3 (ECC) and BW
• BW architecture
• Multidimensional modeling
• Star schema, BW extended star schema
• Administrator Workbench
• InfoObjects (attributes, texts and hierarchies)
• InfoCubes, Transactional (Real-Time) InfoCubes
• ODS (DSO) – Standard, Transactional and Write-Optimized
• PSA, Source Systems
• InfoSources
• InfoPackages
• Transfer Rules
• Update Rules
• Transformations
• Data Transfer Process
• Loading master data using flat files
• Loading transaction data using flat files
• ODS – Change Log, Active Table and New Table
• Managing delta using ODS
• MultiProviders
• InfoSets
• Open Hub Service
• Data Marts
• InfoSpokes
• Open Hub Destination
• Aggregates
• Activating Business Content
• Loading master data from ECC
• Delta Management
• Various types of R/3 extractors
• Extraction from LO Cockpit
• Generic extraction
• Flat-file extraction
• Integrated Planning (IP)
• SAP landscape and Transports
• Real-Time Data Acquisition
• BEx reporting – BEx Query Designer, Calculated and Restricted Key Figures, Variables (Characteristic, Formula and Text Variables), Query Properties, Exceptions and Conditions, BEx Analyzer, Workbooks, Report-to-Report Interface (RRI)
• Information Broadcasting
• Reporting Agent
• Metadata Repository


For further details mail us to: info@sapmdm.co.in

Sunday, March 29, 2009

How to transfer the Data from SAP-System to Non-SAP system without additional Cost/License


Can we transfer reports/data from an SAP system to a non-SAP system without additional cost or licensing?

Yes, we can, using simple FTP/SFTP.

Does this require any additional license or cost?

No.

Is any complex coding or configuration required?

No, it is very simple.

How can I transfer the report/data?

Here I'm considering an SAP BW system running on UNIX as the source, and Windows XP as the non-SAP system.

Source: SAP BW system running on UNIX.

Destination: SQL Server running on Windows XP.

Scenario:

I have data in the SAP BW system, e.g. InfoCube 0SD_C03. 0SD_C03 has some reports, and I want to transfer the report data to SQL Server running on Windows XP.

Alternatively:

You may need to transfer ECC data to SQL Server running on Windows XP. The same logic applies there as well.

Here are the steps...

  1. Set the variable values using fixed restrictions or SAP/customer exit variables in the BW report (if you have any variables).
  2. Using transaction RSCRM_BAPI, schedule the report based on your requirement (daily, weekly, etc.), and dump the report result into a directory on the application server. E.g.:
    Directory: /usr/sap/BP1/DVEBMGS00/work/
    You can see this path in transaction AL11.
  3. Open the FTP port on the BW application server.

  4. Create an FTP/SFTP user ID at application-server level in the BW system.
  5. Using a Windows script, copy the report file from the /usr/sap/BP1/DVEBMGS00/work/ path on the BW system to the SQL Server machine running Windows XP. You can save this script as a batch file and schedule it as required on the Windows server.

Sample Windows script:

image

Note: For a better understanding of the above code, or to change it, please contact your Windows system administrator. This is sample code only.
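Since the original screenshot of the script is not available, here is a hedged sketch of what such a batch file and FTP command file might look like. The host name, credentials and file name below are placeholders, not values from the original post:

```
rem -- getreport.bat (schedule this with the Windows Task Scheduler) --
ftp -n -s:getreport.ftp

-- getreport.ftp (FTP command file; host, user and file are placeholders) --
open bwhost.example.com
user ftpuser ftppass
cd /usr/sap/BP1/DVEBMGS00/work/
get REPORT_DUMP.CSV C:\data\REPORT_DUMP.CSV
bye
```

Once the file lands on the Windows side, it can be loaded into SQL Server with whatever import mechanism you prefer (for example the `bcp` utility or a scheduled import job).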

Here I'm using transaction RSCRM_BAPI to dump the report result onto the application server. You can also use your own method; the intention is simply to show how to transfer data from an SAP to a non-SAP system.

Surendra Kumar Reddy Koduru is an SAP BI consultant currently working with ITC Infotech India Ltd, Bangalore, India. He has more than four years of experience in various BW/BI implementation and support projects.

SAP Skills Trends and Podcast - What Are Your Thoughts? Jon Reed

Hey folks - I wrote this blog entry to hopefully get some feedback from you on a recent podcast I hosted on SAP skills trends. (If you haven't heard it yet, you can click on this link to listen). But before we get into that, a bit of context:

I've been involved in some aspect of the SAP services market since 1995, and from what I have seen so far this year, this looks like the most difficult SAP consulting market I have seen to date. The challenge for SAP professionals seems to be the combination of unprecedented economic problems and the increase in offshoring of SAP skills needs - something I plan to cover in a podcast soon.

Of course, some would say that the benefit of such a downturn is that it provokes reform and innovation. I think most of us would agree that the classic SAP consulting model could use a hearty helping of both. But I do feel for the predicaments of folks who have been emailing me these days. I've been hearing from some senior consultants with pretty compelling skill sets, such as a senior FI/CO person, a senior SAP Logistics person, and a senior CRM Mobile specialist - just some quick snapshots of people who have never had as much difficulty finding new projects.

The good side, if there is one, is that the SAP market has not stopped in its tracks; there is still work to be found. There are two areas that always deserve exploration: one is skills expansion, and the second is marketing those skills. Though I would say that self-marketing is rapidly shifting into something much more useful and important: SAP community involvement. I wrote a piece on the keys to marketing yourself as an SAP consultant last summer that goes into self-marketing in more detail.

On the skills side, we can probably throw most of what we historically understood about SAP skills needs out the window and start afresh. We need new insights and practical tips on what skills are actually in use now. And we need feedback from those in the field on what they are seeing. That's a major reason why I agreed to participate in a recent SAP skills podcast that was coordinated by SAP's Ecosystem Workforce Group. This podcast was a joint venture with my site, JonERP.com, PAC, and K2 Partnering Solutions. We had folks on the podcast from different parts of the globe drawing on global research, so it really was an attempt to provide a big picture view of SAP skills demand now.

If you haven't checked out the podcast yet, or want to see a podcast timeline, check that out on Audrey Stevenson's blog. It's also on the University Alliance home page on SCN. I'm hoping that you'll check it out and share some comments with me on what you took from the podcast and if it resonates or conflicts with what you are seeing. One of the big themes of the podcast is the key to remaining marketable as an onsite consultant in the era of global outsourcing.

The skills needed to be a marketable onsite consultant and the so-called "SAP BPX skill set" seem to be more and more connected. For example, in the podcast, Peter Russo, who leads PAC's SAP Research Practice, talked about some shifts in SAP skills needs he is seeing and some new roles that are emerging. Peter talked about how this ties back into the topic of consultant quality and bringing genuine value to customer sites. In that context, he noted the emergence of the "Business Solutions Architect": a combination of technical and business skills that results from the broadening of the SAP suite and the possibilities that SOA brings. As Peter defined it, the Business Solutions Architect brings both business process know-how and technical expertise to the table. These folks are not necessarily doing hard-core coding, but they are doing process orchestration and bringing management skills to the table as well.

Peter noted that this is a difficult skills profile for customers to develop - thus the need for an onsite consulting presence in this area. The Business Solutions Architect has an onsite relevance due to the importance of local experience, understanding the needs of the company in question, and having the industry know-how to properly advise such a customer. I don't know about you, but to me, this sounds a lot like the BPX skills profile we talk about frequently on this site. Another theme that Peter cited was the importance of Business Intelligence skills and the need for data transparency, competitive analysis, and KPI measurement. These skills also tie into the onsite consultant profile, and I'll be exploring this in further detail in future podcasts.

So with that said, I'm interested in hearing your feedback on these comments and/or on the themes of the podcast.

Jon Reed is the President of JonERP.com - he blogs and podcasts on SAP skills trends.

Road to Data Mining Githen Ronney

I am going to discuss the evolution of Data Mining from a layman's point of view. My perception is that Data Mining is going to be the future of BI as we switch over from the Information era to the Knowledge era. To understand the evolution of Data Mining we have to recapitulate the basics of Artificial Intelligence. The concept of intelligence is built upon four elements: Data, Information, Knowledge, and Wisdom (or Intelligence). Let us unfold each component briefly.

The basic component of intelligence is Data: raw facts and numbers.

Information is produced by assigning meaning to Data. Simply put, we can consider Information as "data in context".

Structured information about a domain is known as knowledge in that domain.

Intelligence or wisdom embodies awareness, insight, moral judgments, and principles to construct new knowledge and improve upon existing knowledge.

The following bank example illustrates the definitions:
Data: The numbers 100 or 5 without any context
Information: Principal amount of money: $100, Interest rate: 5%
Knowledge: At the end of the year I get $105 back
Intelligence: Concept of growth

We have discussed the basic components of Business Intelligence. Now we can look at the stage-by-stage evolution of Data Mining.

Evolutionary Step: Data Collection (1960s)
Business Need: "What was my total revenue in the last five years?"
Enabling Technologies: Computers, tapes, disks
Properties: Retrospective, static data delivery

Evolutionary Step: Data Access (1980s)
Business Need: "What were unit sales in New England last March?"
Enabling Technologies: Relational databases (RDBMS), Structured Query Language (SQL), ODBC
Properties: Retrospective, dynamic data delivery at record level

Evolutionary Step: Data Warehousing & Decision Support (1990s)
Business Need: "What were unit sales in New England last March? Drill down to Boston."
Enabling Technologies: On-line analytic processing (OLAP), multidimensional databases, data warehouses
Properties: Retrospective, dynamic data delivery at multiple levels

Evolutionary Step: Data Mining (Emerging Today)
Business Need: "What's likely to happen to Boston unit sales next month? Why?"
Enabling Technologies: Advanced algorithms, multiprocessor computers, massive databases
Properties: Prospective, proactive information delivery

As we know, we are currently switching over to the Knowledge era, and Data Mining is going to be the indubitable tool of this age. Bear in mind that the tool for the age of Intelligence is still under construction…

Githen Ronney works for MindTree as a BI consultant.

Internal Storing Formats of Typical Currencies Viren Devi

Fundamentals:

Many of us are aware of how currency translation works in BW, in terms of how it fetches exchange rates and where we can perform the translation. Less well known, however, is how some currencies like HUF and KRW are stored in the underlying tables. This blog sheds some light on these currency fundamentals, the issues they cause, and the solutions we implemented.

For some currencies, such as JPY, KRW and HUF, there is a difference between how values are stored in the SAP tables and how they are displayed to users. This depends on the number of decimal places maintained in table TCURX for the corresponding currency. For JPY, KRW and HUF it is maintained as 0, which indicates that the value is not supposed to have any decimal places. But in BW most currency value fields are of data type CURR (mostly 2 decimal places), which means that during storage the values get two decimal places, apparently violating the above statement!

How SAP tackles this can be explained with the example below.

Currency: HUF

E.g.

I. Entered value in VK11 (Pricing transaction): 329186
II. Stored value in the table KONP: 3,291.86
III. Stored value in the DSO table of BW: 3291.86
IV. Displayed value to the users in reports built on the DSO: 329186

If no entry is maintained for a currency in table TCURX, it defaults to two decimal places (GBP, EUR, etc.). As the external representation depends on the decimal places maintained in TCURX, the stored and displayed values are the same for GBP and EUR.
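As an illustration (not SAP code), the conversion described above can be sketched in Python: the external value is the internal CURR value shifted by 10 to the power of (2 minus the TCURX decimals), so HUF (0 decimals) is shifted by a factor of 100, while GBP/EUR (2 decimals) are unchanged.

```python
from decimal import Decimal

def to_external(internal, tcurx_decimals):
    """Internal CURR value (2 implied decimals) -> value shown to the user."""
    return internal * Decimal(10) ** (2 - tcurx_decimals)

def to_internal(external, tcurx_decimals):
    """Value shown to the user -> internal CURR value (2 implied decimals)."""
    return external / Decimal(10) ** (2 - tcurx_decimals)

# HUF has 0 decimals in TCURX: stored 3291.86 is displayed as 329186.
print(to_external(Decimal("3291.86"), 0))  # 329186.00
# GBP/EUR have 2 decimals: stored and displayed values are the same.
print(to_external(Decimal("100.00"), 2))   # 100.00
```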


Practical issue: How does this internal format affect us? Please see the example below.


Example:

When values are passed from a DSO to the APD by using a query, the fundamentals above are not taken care of by the system:
Flow Chart

Currency: HUF

E.g.

I. Entered value in VK11 (Pricing transaction): 329186
II. Stored value in table KONP: 3,291.86
III. Stored value in BW table of DSO1: 3291.86
IV. Stored value in the direct update DSO2, and DSO3 : 329186.00
V. Displayed value in the report based on the DSO3 : 32918600

The values stored in the direct-update DSO2 look incorrect, which indicates that the APD is not intelligent enough to understand internal/external formats. The query reads the value and stores it in the direct-update DSO2 as 329186.00, treating it as having 0 decimal places. If any other data flow reads from DSO3, it will fetch 32918600 instead of 329186.

Solution:

To resolve the issue, we either need to introduce a routine that checks the decimal places in table TCURX and divides the value by the appropriate power of 10, or find a standard SAP function module. We found the standard function module '/POSDW/CURRENCY_DECIMALS_IN', which divides the passed input by a power of 10 according to the number of decimal places maintained in the table.


Declaration:

DATA: wa_currency  TYPE /bi0/oicurrency,
      wa_round(16) TYPE p DECIMALS 0,
      wa_result    TYPE /posdw/amount,
      wa_round1    TYPE char17.

Code:

* Round the CURR value to 0 decimal places.
CALL FUNCTION 'ROUND'
  EXPORTING
    decimals = 0
    input    = price
  IMPORTING
    output   = wa_round.

* Convert to character and strip the thousands separators.
WRITE wa_round TO wa_round1.
REPLACE ALL OCCURRENCES OF ',' IN wa_round1 WITH space.
CONDENSE wa_round1.

wa_currency = currency.

* Divide by a power of 10 according to the decimals in TCURX.
CALL FUNCTION '/POSDW/CURRENCY_DECIMALS_IN'
  EXPORTING
    pi_amount   = wa_round1
  IMPORTING
    pe_amount   = wa_result
  CHANGING
    pc_currency = wa_currency.

* Result value of the routine.
RESULT = wa_result.

All the steps before calling module '/POSDW/CURRENCY_DECIMALS_IN' simply convert the value from data type CURR to CHAR.


Corrected output would look like below,

Currency: HUF

E.g.

I. Entered value in VK11 (Pricing transaction): 329186
II. Stored value in table KONP: 3,291.86
III. Stored value in table DSO1 of BW: 3291.86
IV. Stored value in the directly updated DSO2 : 329186.00
V. Displayed value in the report based on the directly updated DSO2 : 32918600
VI. Output of routine which includes function module : 3291.86
VII. Displayed value in the report based on the directly updated DSO3 : 329186


Note:

We faced the above-mentioned issue while on support package 8; after upgrading to support package 16 the problem was resolved. This means the issue may not occur in all cases.

In case the above routine doesn't exist in your system, please let me know at viren.devi@capgemini.com.

SAP + BusinessObjects = BI Self Services? Bharath Ajendla

All of you reading this blog are already familiar with SAP's acquisition of Business Objects. It is definitely an eye-catching merger that all of us as SAP practitioners have become aware of. There are several theories on why it happened, such as expanding the customer base, product consolidation, etc. Whatever the reason, the effect is very clear: more and more customers are opting for SAP BusinessObjects as their de facto BI standard. The reason behind that flocking is simple and complex at the same time. Simple, because the BusinessObjects BI suite standardizes the existing BI infrastructure. The complex reason is that it allows business users to write their own reports. Are we talking BI self-services here? Is it a simple job to make BI self-services a reality? And what happens to the IT departments that have been serving business needs so far by creating reports? To understand this a little better, we need to dive into history.


Do you remember the old days of telephone systems? People used to call the telephone operator, who connected them to the other party. I wasn't around back then, but I saw it on the History Channel. Slowly, the operators were replaced with automation, resulting in "self-service telephones". Similarly, take DVD rentals: nowadays we see self-service kiosks. A reliable level of automation leads to self-services.


Now let us come back to the SAP world. Self-services are quite famous in the SAP HR world because they remove the burden of maintaining employee information from the HR department and give it back to the employees. Does this mean that HR departments vanished? Their headcount might have decreased, but they can now focus on more strategic issues, unlike ever before. This didn't happen in a single day: first the backbone of HR became reliable and stable enough to support the transformation. Once the org-management, time and payroll areas had stabilized as products, self-services were in full swing.


Similarly, we have all kinds of self-services, such as manager, supplier and customer self-services, which save money and time. So what is next in line? The answer may be business intelligence. Based on what I read, the BusinessObjects tool set allows business users to create their own reports. To make that happen, the underlying framework needs to be robust enough to accommodate all the needs, as well as reliable. Also, the end-user tools need to be as good as iPhone gadgets. I think the merger of SAP with Business Objects accelerated this much-desired change. What the IT department needs to do is embrace this change: take a back seat and provide the services that enable business users to get what they want in the way they desire.


I look forward to sharing how SAP BusinessObjects tools enable the BI self services in companies and corresponding challenges in future articles.

Bharath Ajendla is a NetWeaver Architect working for Capgemini.

Faster Universe-Based Access To BW via MDX Thomas Zurek

Business Objects' semantic layer can expose cubes as OLAP universes to client tools like Web Intelligence (WebI for short). This applies to OLAP servers like Microsoft's Analysis Services or Hyperion's Essbase, but also to SAP BW. A component called OLAP Data Access (ODA) retrieves data via MDX, a query language for multi-dimensional data sources. In the case of BW, the ODA component connects to BW's OLAP BAPI. This connection has now been optimized and streamlined in order to improve interoperability. This blog gives some insight into what has been done and what the effects are. It complements the blog on the two options for accessing BW data via universes.

Figure 1 shows the improvements that have been implemented with BW 7.01 (aka NW BI 7.0 EhP1). On the left hand side, there is a new option based on the Data Federator and leading to relational universes. Further information can be found in this blog. On the right hand side, the existing option based on the ODA connecting to the OLAP BAPI is shown. It has existed for many years and also for earlier releases of BW. However, with BW 7.01 the interoperability between ODA and the OLAP BAPI has been streamlined leading to better performance and less resource consumption - see the discussion on figure 3 below. Please note: for simplicity, non-universe based query access via BICS (for BEX or Pioneer), ODBO (e.g. for native Excel) or XML/A is not shown in figure 1. This does not indicate or imply that they will go away.


Figure 1: Universe-Based Access Options with BW 7.01

The interoperability has been improved in the following three areas, also visualized in figure 2:

  1. Avoid unnecessary sorting:
    The MDX standard requires results to be sorted. However, those sorts are typically ignored or not required by WebI, as it sorts the results on its own anyway. Consequently, sorting inside the OLAP BAPI is redundant and can now be skipped by using a new UNORDER() function in SAP's MDX.
  2. Leaner memory consumption during flattening:
    A new API has been created that uses a new, non-MDX-standard compliant result structure. This new API is private to ODA as it is tailored towards ODA. The MDX standard imposes a cell-based result in which each cell carries a number of properties. A lot of that information has been neglected by the ODA, similar to the sort order discussed in item 1. This means that the new API has a much slimmer result structure - thus (a) consuming less memory and (b) requiring less processing. This has led to an optimized flattening algorithm, now on the API side.
  3. Leaner communication ODA – OLAP BAPI:
    The communication between the ODA and the new API has been switched to a compressed data exchange via binary basXML - an optimized SAP-proprietary format. This reduces (a) memory consumption and (b) communication costs.
In order to implement these changes without affecting the certified partner tools that are based on the OLAP BAPI (go here and search for BW-OBI certifications), a new, so-called rowset API has been created. This API is not part of the OLAP BAPI but is private and proprietary to ODA, as it is tailored towards the ODA use case.


Figure 2: New Interoperability via ODA

Figure 3 shows the effects of those improvements with respect to query runtime performance (red line) and memory consumption (blue line). Runtime and memory consumption values have been compared for a number of real-world customer queries defined in WebI. The vertical axis shows the runtime in 7.01 (i.e. leveraging the improvements described above) relative to the runtime in a BW 7.00 system: a value of 60% means that the runtime in 7.01 is only 60% the runtime in 7.00, i.e. it went down by 40%. The memory consumption is calculated in the same way. On the horizontal axis, the test queries are lined up. They have been sorted in a way that queries to the left have shown the most runtime improvements (i.e. red line is low) while the queries to the right have improved only slightly (i.e. red line closer to 100%). On average, the runtime has gone down by 30% while memory consumption has been reduced by over 50% with the used set of WebI queries.


Figure 3: Performance and Resource Consumption Improvements with BW 7.01

The technical prerequisites for leveraging the improvements are the following:

  • SAP NetWeaver BW 7.01 SPS3
  • Business Objects Enterprise XI 3.0 with
    • Fix Pack 2 w/ LAFix2.1
    • SAP Integration Kit Fix Pack 2 (+ LaFix 2.1)

Thomas Zurek is a development manager for BI in SAP NetWeaver.

Calculating Number of Working Days in Query Level Baris Gunes

Calculating the number of working days between two dates (e.g. year-to-date: from 1st January to the current day) and using it in query calculations is a very common requirement in reporting models.

This blog gives a solution for this type of calculation using the factory calendars maintained in BW. To see the available factory calendars, table TFACD (factory calendar definitions) can be used; details of the factory calendars can be found in TFACS.

There are several function modules that use these factory calendars and help with the necessary calculations.(*) One of them is DATE_CONVERT_TO_FACTORYDATE, which returns the factory date for the date and factory calendar passed. Calling this FM twice (once with the start date and once with the end date) gives two numbers, and their difference is the total number of working days in between. Once you have this function module, it becomes very easy to include the calculation at query level by creating a formula variable (customer exit) and calling DATE_CONVERT_TO_FACTORYDATE with the desired period dates.
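As a rough illustration of the same difference-of-two-dates idea, here is a hedged Python sketch. It uses plain weekdays instead of an SAP factory calendar, which would also know about public holidays:

```python
from datetime import date, timedelta

def working_days(start, end):
    """Count Monday-Friday days in [start, end].

    A real factory calendar (tables TFACD/TFACS) would also exclude
    public holidays; this sketch only skips weekends.
    """
    days = 0
    current = start
    while current <= end:
        if current.weekday() < 5:  # 0 = Monday ... 4 = Friday
            days += 1
        current += timedelta(days=1)
    return days

# Monday 2009-03-02 to Friday 2009-03-06: a full working week.
print(working_days(date(2009, 3, 2), date(2009, 3, 6)))  # 5
```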

As an example, the steps below describe the "number of worked days in the month up to today" calculation using this strategy.

- Create a formula in query designer and double-click on it.

- Right Click on Formula Variables Folder in Available Operands Panel

- Create a formula variable “ZWRKDDYS”.

- Select “Customer Exit” for “Processing By” property.

- Go to Transaction CMOD and open project RSR0001

- Insert the following code to the project.

IF i_vnam EQ 'ZWRKDDYS'.

  DATA: start_date   TYPE scal-facdate,
        end_date     TYPE scal-facdate,
        bgn_of_month TYPE sy-datum,
        prev_wrk_day TYPE sy-datum,
        num_of_days  TYPE i,
        l_s_range    TYPE rsr_s_rangesid.

* First day of the current month.
  CONCATENATE sy-datum+0(6) '01' INTO bgn_of_month.
* Yesterday - today is not included in the calculation.
  prev_wrk_day = sy-datum - 1.

* Factory date of the last working day on or before yesterday.
  CALL FUNCTION 'DATE_CONVERT_TO_FACTORYDATE'
    EXPORTING
      correct_option      = '-'
      date                = prev_wrk_day
      factory_calendar_id = '00'
    IMPORTING
      factorydate         = end_date.

* Factory date of the first working day on or after the 1st of the month.
  CALL FUNCTION 'DATE_CONVERT_TO_FACTORYDATE'
    EXPORTING
      correct_option      = '+'
      date                = bgn_of_month
      factory_calendar_id = '00'
    IMPORTING
      factorydate         = start_date.

* The difference of the two factory dates is the number of working days.
  num_of_days = end_date - start_date.

  l_s_range-sign = 'I'.
  l_s_range-opt  = 'EQ'.
  l_s_range-low  = num_of_days.
  APPEND l_s_range TO e_t_range.

ENDIF.

- Save and Activate.

- In the formula, the number of working days up to today can now be seen and used for other calculated key figures.

Assumptions:

There is only one factory calendar with id = ‘00’.

Today is not included in calculation.

Project “RSR0001” is already activated in CMOD.

* For all available function modules please check the link.

Baris Gunes is a BI consultant at Accenture Istanbul Office.

How to trigger the Process Chains in BW from ECC Surendra Reddy

Can we trigger process chains / start a job in BW from ECC?

Yes, we can trigger them by using a simple function module in ECC.

When do we need this logic?

We need this kind of program/logic in the following cases:

After performing some job in ECC, once it has executed successfully (on successful execution, the ECC program triggers the event), you immediately want to load the data into the BW system using process chains.

E.g.

  • You have some dependencies in ECC; after completing those tasks you want to trigger a job in BW.
  • You have Y/Z tables in ECC that are updated every day by a program; after the table has been uploaded, the program triggers an event, the process chain starts in BW, and the data is loaded into an InfoCube/DSO.

Scenario:

Every week (sometimes the interval varies), marketing executives collect the sales information from dealers (the sales details between dealers and customers) and upload the data into a Y/Z table in ECC using an ABAP program. The event program is inserted at the end of the upload program, so that if the upload succeeds, the event is triggered and the data loads start in BW through the process chain. The users can then immediately see the related reports in BW.

Zupload_Program.


IF data upload is successful.
  Raise event through ZEVENT_ECC_BW program.
ELSE.
  Send a mail to the user: "Error in data upload".
ENDIF.
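Since the screenshots of the program are not available, here is a hedged ABAP sketch of the event-raising part, using the standard function module BP_EVENT_RAISE. The event name, parameter and RFC destination below are placeholders, not values from the original post:

```abap
REPORT zevent_ecc_bw.

* Raise the background event in the BW system via an RFC destination.
* 'BWCLNT100' and 'ZEVENT_BW' are placeholder names.
CALL FUNCTION 'BP_EVENT_RAISE' DESTINATION 'BWCLNT100'
  EXPORTING
    eventid                = 'ZEVENT_BW'
    eventparm              = 'SALES_UPLOAD'
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.

IF sy-subrc <> 0.
  WRITE: / 'Error raising event in BW, sy-subrc =', sy-subrc.
ENDIF.
```

The event raised here is the one the process chain's start condition in RSPC waits for, as described in the steps below.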

Steps:
Create an event in the BW system using transaction SM62:

image

image

Create a Program in SE38 in ECC:

image

Code:

image

In the BW system, go to RSPC and create a process chain, entering the event name and parameter name...

image

image

Save the process chain, then activate and execute it. If you don't execute the process chain, it won't be triggered even if you execute the program in ECC.

Then execute the program in ECC and check the log in RSPC in BW; your process chain will be triggered.

Surendra Kumar Reddy Koduru is an SAP BI consultant currently working with ITC Infotech India Ltd, Bangalore, India. He has more than four years of experience in various BW/BI implementation and support projects.

Association Rule Learning – First Step to Data Mining Githen Ronney

This blog is for all the Data Mining and algorithm freaks.

Generally, Data Mining can be done in four ways:

  1. Associative Rule Learning :- Searches for relationships between variables. For example a supermarket might gather data of what each customer buys. Using association rule learning, the supermarket can work out what products are frequently bought together, which is useful for marketing purposes. This is sometimes referred to as "market basket analysis".
  2. Regression :- Attempts to find a function which models the data with the least error. A common method is to use Genetic Programming.
  3. Classification :- Arranges the data into predefined groups. For example an email program might attempt to classify an email as legitimate or spam. Common algorithms include Nearest neighbor, Naive Bayes classifier and Neural network.
  4. Clustering :- Is like classification but the groups are not predefined, so the algorithm will try to group similar items together.

The above definitions are taken from Wikipedia. In this blog I mainly concentrate on Association Rule Learning (Market Basket Analysis) and the Apriori algorithm.

Associative Rule Learning (Market Basket Analysis ) :-

Market Basket Analysis (MBA) is a technique that helps you identify relationships between pairs of products purchased together. The prime aim of this technique is to identify cross-selling opportunities. The retail business benefits greatly from MBA.

We can discuss MBA through an example. Using MBA you might unfold the fact that customers tend to buy bread and butter together. Based on this information you might organize your store so that bread and butter are next to each other. Now let us discuss the important measures in MBA.

Frequency :-

The number of times two products were purchased together. If bread and butter are found together in 1700 baskets, then 1700 is their frequency.

Support :-

Support = frequency / Total number of orders.

If bread and butter were purchased together 1700 times and your store had 2000 orders, the support would be calculated as 1700/2000 = 85%.

Confidence :-

Confidence is the ratio between the number of times the pair was purchased and the number of times one of the items in the pair was purchased. In mathematical terms, it is the conditional probability. In our example, if bread was purchased 1800 times and 1700 of those purchases contained butter, we would have a confidence of 1700/1800 = 94.4%.
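To make the arithmetic concrete, here is a small Python sketch of the three measures, using the bread-and-butter numbers from above:

```python
def mba_measures(pair_count, total_orders, item_count):
    """Frequency, support and confidence for a product pair.

    pair_count   - orders containing both products
    total_orders - all orders in the store
    item_count   - orders containing the first product of the pair
    """
    frequency = pair_count
    support = pair_count / total_orders * 100
    confidence = pair_count / item_count * 100
    return frequency, support, confidence

# Bread & butter: together in 1700 of 2000 orders; bread bought 1800 times.
freq, supp, conf = mba_measures(1700, 2000, 1800)
print(freq, round(supp, 1), round(conf, 1))  # 1700 85.0 94.4
```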

Now we can look at an MBA report. In this report, the user selects the product they are interested in analyzing; in our example it is bread. The report then lists all the products that were purchased together with bread.

Product   Frequency   Support   Confidence
Butter    1700        85.0%     94.4%
Jam       1400        70.0%     77.7%
Beer      400         20.0%     22.2%

Based on this report, the merchandiser can take the necessary decisions. High confidence means there is a strong relationship between the products; low support means the pair is not purchased often. A pair with both high confidence and high support is the dream pair.

Algorithms:-

One of the main challenges in database mining is developing fast and efficient algorithms that can handle large volumes of data because most mining algorithms perform computation over the entire database and often the databases are very large. For the smooth functioning of any Data Mining project we need strong and efficient algorithms as all we know algorithms are the backbone of any software.

Apriori is the best-known algorithm for mining association rules. It uses a breadth-first search strategy to count the support of itemsets.

The section below is for readers interested in the algorithm itself.

Apriori Algorithm :-

I cannot give a better example than the one in Wikipedia to explain this algorithm.

A large supermarket tracks sales data by SKU (item), and is thus able to know which items are typically purchased together. Apriori is a moderately efficient way to build a list of frequently purchased item sets from this data. Let the database of transactions consist of the sets {1,2,3,4}, {2,3,4}, {2,3}, {1,2,4}, {1,2,3,4}, and {2,4}. Each number corresponds to a product such as "butter" or "water". The first step of Apriori is to count up the frequencies, called the supports, of each member item separately:

Item   Support
1      3
2      6
3      4
4      5

We can define a minimum support level that an itemset must reach to qualify as "frequent"; the right level depends on the context. For this case, let min support = 3. All four items therefore qualify. The next step is to generate a list of all 2-item pairs of the frequent items. Had any of the items above not been frequent, it would not have been included as a member of any candidate 2-item pair. In this way, Apriori prunes the tree of all possible sets.

Item    Support
{1,2}   3
{1,3}   2
{1,4}   3
{2,3}   4
{2,4}   5
{3,4}   3

The algorithm then counts the occurrences of each of those pairs in the database. Since min support = 3, we don't need to generate 3-sets involving {1,3}: because {1,3} is not frequent, no superset of it can possibly be frequent. Continuing:

Item      Support
{1,2,4}   3
{2,3,4}   3

Note that {1,2,3} and {1,3,4} were never generated or tested. At this point the algorithm stops: the only possible 4-set is {1,2,3,4}, and its subset {1,3} has already been ruled out as infrequent.

In much larger datasets, especially those with many items that each appear in few transactions and a few items that appear in many, the gains from pruning the candidate tree in this way can be very large.

Pseudo Code :-

Ck: candidate itemsets of size k

Lk: frequent itemsets of size k

Join step:  Ck is generated by joining Lk-1 with itself.

Prune step: any (k-1)-itemset that is not frequent cannot be a subset of a frequent k-itemset, so it is discarded.

L1 = {frequent items};
for (k = 1; Lk != ∅; k++) do begin
    Ck+1 = candidates generated from Lk;
    for each transaction t in the database do
        increment the count of all candidates in Ck+1 that are contained in t;
    Lk+1 = candidates in Ck+1 with min_support;
end
return the union of all Lk;
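For readers who prefer runnable code, here is a plain Python sketch of Apriori (an illustrative implementation, not tuned for large datasets) that reproduces the worked example above:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return {frozenset(itemset): support_count} for all frequent itemsets."""
    transactions = [frozenset(t) for t in transactions]
    # L1: count every single item and keep the frequent ones.
    frequent = {}
    current = []
    for item in {i for t in transactions for i in t}:
        count = sum(1 for t in transactions if item in t)
        if count >= min_support:
            itemset = frozenset([item])
            frequent[itemset] = count
            current.append(itemset)
    k = 2
    while current:
        # Join step: build k-item candidates from frequent (k-1)-itemsets.
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        # Prune step: every (k-1)-subset of a candidate must itself be frequent.
        candidates = {
            c for c in candidates
            if all(frozenset(s) in frequent for s in combinations(c, k - 1))
        }
        current = []
        for c in candidates:
            count = sum(1 for t in transactions if c <= t)
            if count >= min_support:
                frequent[c] = count
                current.append(c)
        k += 1
    return frequent

# The transactions from the worked example, with min support = 3.
freq = apriori([{1,2,3,4}, {2,3,4}, {2,3}, {1,2,4}, {1,2,3,4}, {2,4}], 3)
```

Running this yields exactly the tables above: all four single items are frequent, {1,3} is dropped at the pair stage, only {1,2,4} and {2,3,4} survive as 3-sets, and no 4-set is ever counted.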

References:

http://en.wikipedia.org/wiki/Data_mining

http://en.wikipedia.org/wiki/Association_rule_learning

Books:

Data Mining: Practical Machine Learning Tools and Techniques by Ian H. Witten and Eibe Frank.

Githen Ronney works for MindTree as a BI consultant.

Computing Center Management System (CCMS) Viren Devi

When we want to monitor process chains, the first transactions that come to mind are RSPC and RSPCM. RSPC shows chains in isolation, so a person needs to click on each chain to get its data-loading status, whereas RSPCM provides the up-to-date status of all the process chains in the system but is limited to process-chain monitoring. This blog sheds some light on the Computing Center Management System (CCMS), which consists of the alert monitor together with the BW monitor; these contain SAP BI-relevant monitoring trees as well as monitoring trees for process chains and for consistency checks in the analysis and repair environment.

To get to CCMS, go to RSMON -> Monitors -> BI CCMS, or use transaction RZ20 -> SAP BI Monitors -> BI Monitor.

Small things to know:

The SAP BI CCMS monitor is divided into three categories: the BI accelerator monitor, key performance indicators, and the BI Monitor. In this blog we will talk about the BI Monitor.

CCMS BI Monitor


To collect logs, the CCMS agent uses a number of methods. Each method calls a transaction, function module, report, or URL to collect the logs. Methods can be accessed from RZ21 -> Methods -> Definition, or on the CCMS BI monitor screen via Views -> Method Allocation. The latter option displays the relevant method against each corresponding node in the tree.

Method Definition

E.g.
Method: RSPC_CCMS_AGENT.
Function module used: RSPC_CCMS_MA_UPDATE

We can define the number of days CCMS should hold logs for; this is controlled by the parameter DAYS_TO_KEEP_LOGS, which is set to 7 by default, meaning CCMS shows process chain runs from the last seven days. In addition, under the Control tab we can define whether the method should be executed in the background or in the foreground.

Method Definition

Transactions monitored in BI monitor:

I. RSPC: CCMS provides a consolidated log view of all the process chains in the system. It gives a clear picture of whether a process chain is in progress, successful, or failed by showing the chain in yellow, green, or red respectively. By default the CCMS agent for process chains is scheduled to run every hour. It uses the job SAP_CCMS_MONI_BATCH_DP to collect all the logs from the system; the scheduling can easily be changed in transaction SM37.

Process Chains

II. DB02: This transaction keeps logs of the system's space management, performance, backup, and overall health.

a) Space Management: As shown below, it shows how much free space is available overall, and how much PSA table, temporary table, and UNDO tablespace is available. These figures should be analyzed before starting any data loads.

Space Management

b) SAP Consistency: It gives a list of all missing or inconsistent objects in the database, such as primary indexes, secondary indexes, tables, and views. This especially helps the support team when monitoring report performance, as missing secondary indexes strongly impact query performance.

SAP Consistency

In addition to the above, it also provides details about the health and performance of the system.

III. RSRV (consistency checks in the analysis and repair environment): This is one of the most essential transactions for checking the consistency of all BI objects, such as cubes, DSOs, hierarchies, PSA tables, BEx objects, documents, aggregates, DTPs, BIA, cache, transaction data, and master data. CCMS provides a consolidated view of the consistency of all of the above objects.

RSRV

IV. Transactional RFC and Queued RFC: CCMS is also useful for monitoring SM58 (transactional RFC), SMQ1 (outbound queue), and SMQ2 (inbound queue). It provides details of calls that failed due to communication errors, execution errors, or lacking server resources. The QOUT scheduler, which can be used to change or analyze the number of open RFC connections between different systems, can also be checked from CCMS.

Transactional RFC and Queued RFC

V. SM37 & SM51 (background processing): CCMS is also useful for monitoring background jobs in the system. It provides details of which job was aborted on which server.

Background Processing

VI. ALE IDoc monitoring (BDMO): As shown below, CCMS also provides the facility to monitor the status of outbound and inbound ALE IDocs in the system. It gives a step-by-step update on IDoc status: whether an IDoc failed at the IDoc interface or was dispatched successfully is shown in the monitor under the corresponding status group.

Idocs Monitoring

From the above it seems SAP has given BI consultants, and support team members in particular, a convenient way to monitor the whole system without having to remember individual transactions or make any extra effort.

Viren Devi is a certified SAP BI consultant at Capgemini.

SAP BI-IP : Allocation Using FOX Formula Meric Celik

Allocation is a fundamental procedure of business planning for a company. In a top-down planning approach in particular, planned values must be allocated down to lower levels, which the company determines for specific business purposes.

SAP Business Intelligence Integrated Planning (SAP BI-IP) provides standard allocation formulas that cover basic allocation requirements. For more complex business cases the standard formulas cannot fulfill the requirement; in such cases FOX, which comes with SAP BI-IP, can handle complex development requirements.

This blog shows the development of a FOX formula for the sample business case described below:

Table

Info Provider (for planning purposes):

  • Actual (ZACTUAL)
  • Plan (ZPLAN)
  • Discount Ratio (ZDISCRAT)
  • Multiprovider (ZMPPLAN)

Key Figures:

  • Sales (ZSALES)
  • Trade Promotion (ZTRADPR)

Allocation Rule:

  • Trade Promotion (TP) --> top-down planning approach; the company agrees on a yearly TP plan value for each sales office.
  • Sales --> the company's actual data is available at all levels.
  • Discount --> actual discount ratios are available at all levels and are entered into the system manually.

Allocation: Trade Promotion should be allocated to all levels of the data shown above, using actual invoiced sales as the reference data.

Invoiced Sales = Sales (Actual) - Discount (Actual)

At that point, it is hard and sometimes inefficient to use the standard distribution formula in BI-IP. Therefore the allocation is developed in the FOX formula below.

*****************************************************************

*Data Declaration

DATA CY TYPE 0CALYEAR.
DATA CM TYPE 0CALMONTH.
DATA SALES_OFF TYPE ZSALES_OFF.
DATA BRAND TYPE ZBRAND.
DATA MATERIAL TYPE 0MATERIAL.
DATA IN_TOT TYPE F.
DATA TP_TOT TYPE F.

FOREACH SALES_OFF, CY.

*** Yearly TP value (year from the outer loop; month not restricted) ***

TP_TOT = {ZTRADPR, CY, #, SALES_OFF, #, #, ZPLAN}.

*** Total invoiced sales calculation ***

FOREACH CM, BRAND, MATERIAL.

IN_TOT = IN_TOT + {ZSALES, CY, CM, SALES_OFF, BRAND, MATERIAL, ZACTUAL} - ( {ZSALES, CY, CM, SALES_OFF, BRAND, MATERIAL, ZACTUAL} * {ZDISCOUNT, CY, CM, SALES_OFF, BRAND, MATERIAL, ZDISCRAT} ).

ENDFOR.

*** Allocation, proportional to each record's share of invoiced sales ***

FOREACH CM, BRAND, MATERIAL.

{ZTRADPR, CY, CM, SALES_OFF, BRAND, MATERIAL, ZPLAN} = TP_TOT * ( {ZSALES, CY, CM, SALES_OFF, BRAND, MATERIAL, ZACTUAL} - ( {ZSALES, CY, CM, SALES_OFF, BRAND, MATERIAL, ZACTUAL} * {ZDISCOUNT, CY, CM, SALES_OFF, BRAND, MATERIAL, ZDISCRAT} ) ) / IN_TOT.

ENDFOR.

*** Resetting the running total for the next sales office and year combination ***

IN_TOT = 0.

ENDFOR.

*****************************************************************

The FOX formula above is a sample of how to apply complex allocation rules. In some cases the business process will be more complicated than the case above. However, by applying FOX's other useful functionalities, such as reading a variable value (instead of fixing it in a FOR loop), reading master data attributes, concatenation, and so on, the logic remains the same in every business process.
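For readers more comfortable with general-purpose code than FOX, the same allocation logic can be sketched in Python. All record keys and figures below are made up for illustration; the point is the proportional split of a yearly TP value by invoiced sales, mirroring the FOX loops above:

```python
# Hypothetical records. Key: (year, month, sales_office, brand, material).
sales_actual = {
    (2009, 1, "OFF1", "B1", "M1"): 1000.0,
    (2009, 1, "OFF1", "B1", "M2"): 3000.0,
}
discount_ratio = {
    (2009, 1, "OFF1", "B1", "M1"): 0.10,
    (2009, 1, "OFF1", "B1", "M2"): 0.20,
}
tp_yearly_plan = {(2009, "OFF1"): 500.0}  # agreed yearly TP value per office

def allocate_tp(sales_actual, discount_ratio, tp_yearly_plan):
    """Allocate each yearly TP value to detail records, proportional to
    invoiced sales = sales actual * (1 - discount ratio)."""
    invoiced = {k: v * (1 - discount_ratio.get(k, 0.0))
                for k, v in sales_actual.items()}
    plan = {}
    for (year, office), tp_total in tp_yearly_plan.items():
        keys = [k for k in invoiced if k[0] == year and k[2] == office]
        in_tot = sum(invoiced[k] for k in keys)  # mirrors IN_TOT in FOX
        for k in keys:
            plan[k] = tp_total * invoiced[k] / in_tot
    return plan

plan = allocate_tp(sales_actual, discount_ratio, tp_yearly_plan)
# Invoiced sales are 900 and 2400, so the 500 splits roughly 136.4 / 363.6.
```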

Assumption:

  • The FOX formula in IP is implemented after creating the necessary basic IP objects, such as aggregation levels and filters, as well as BI developments such as planning cubes. It is assumed here that everything is ready for the FOX formula implementation.

Meric Celik is an SAP BI consultant in Accenture's Istanbul office.

The SAP BusinessObjects Community Newsletter has launched! Ryan Oliver

Hello, as Editor of the BOC Newsletter I am pleased to announce that the first issue of the new SAP BusinessObjects Community Newsletter has been released in North America (as a monthly publication) and globally (as a quarterly publication), bringing you the latest in technical Crystal Reports news and SAP BusinessObjects solutions. If you have not yet updated your profile to receive the 'BOC Newsletter', please do so here.

Formerly called 'DeveloperLink', the newsletter brings you a wealth of technical resources and has now evolved to include resources on how Business Objects and SAP products can work together.

If you are a Crystal Reports developer, or integrate, deploy or design with Business Objects BI solutions, then you should be subscribed to this technical newsletter. To get a sense of what you will receive monthly, take a look at our newsletter archive page.

Since 2006, we have been providing our subscribers with valuable resources for Crystal Reports, Crystal Reports Server, SAP BusinessObjects Enterprise, and Xcelsius, including:

  • Technical webinars
  • Free Trial downloads
  • Free product downloads
  • White papers
  • Technical docs
  • Code samples
  • Report samples
  • Dashboards
  • Our own Developer Blogs
  • 3rd party user articles
  • Product offers and announcements
  • Event info
  • Knowledge base articles

The newsletter concentrates on .NET- and Java-themed articles to ensure we cover the development environments our products use.

As you can see by the length of the newsletters, we have a lot of internal content each month waiting to get in the hands of our users.

Also, I have created a content Wiki that you may find useful in the future for being heard on technical topic requests that are important to you. Check it out here.

Thanks.

Ryan

Ryan Oliver is a Marketing Specialist and Editor of the BOC Newsletter for SAP BusinessObjects.

New opportunities in economic crisis - how does the current economic crisis change the Chemical Industry sector? Thorsten Wenzel

The economic crisis has also reached the Chemical Industry. The current situation will therefore accelerate change beyond consolidation and classical cost savings in purchasing, logistics, and general sustainability. Product and technology cycles will continue to shorten, and innovation is and remains the pivot of successful development.

If we look at Research & Development in Chemicals over the last two centuries, we find very few joint development approaches. This has been changing for the past few years. It started with cooperations within the chemical ecosystem between different chemical producers, also promoted by demerger and outsourcing activities. A few chemical producers are ahead of this development. They are entering the so-called Hybrid Innovation Area, which can be described as joint or custom development activities that cross industry boundaries, developing new products and applications with stakeholders such as universities, research institutes, other industries, customers, and even competitors.

In particular, the role of the sales representative is changing from a classical sales function toward a more customer-process-oriented application engineer. Appropriate IT infrastructures and processes are required to cover the need to exchange information and data among all parties involved.

Based on a service-oriented architecture, product innovators are able to orchestrate alliances and intellectual capital. SAP is actively working on this topic together with our customers in order to shape a solution able to cover these complex IT requirements beyond company boundaries.

We would be glad to welcome you to these discussions and to the opportunity to share experiences and expectations. Please contact me, or simply provide a description of what you would like to see in this type of application, on this site.

Thorsten Wenzel Director Business Development IBU Chemicals

Which Dashboarding Solution Is Right for You? Jason Cao

"Xcelsius Enterprise and SAP BusinessObjects Dashboard Builder are the go-forward offerings for customers and partners who are looking for dashboard and visualization solutions." This from the Product Managers of Dashboard Builder and Xcelsius Enterprise.

Dashboard Builder Xcelsius Enterprise

Join Erica Lailhacar (Dashboard Builder) and Francois Imberton (Xcelsius Enterprise), in a live webinar on April 1st, 2009 10am PST, as they discuss how to determine which dashboarding solution to use, and describe the advantages of using both products together to get the best value of the two solutions. There will also be a demo of new features in Dashboard Builder 3.1.

Read more and register for this upcoming webinar.

Jason Cao is a member of the Business Objects Community team, and manages community engagement and collaboration.