Tuesday, April 12, 2011

Difference between Operational Data Store (ODS) and InfoCube

InfoCubes have a multidimensional structure with dimension tables (a maximum of 16, of which 13 are customer-defined) and one fact table. They are meant for summarized records.
ODS objects store data at a more granular level. They have flat structures, like a table in R/3, and they offer a unique "overwrite" feature that cubes lack.
Data loaded into an ODS can subsequently be loaded onward into a cube.
One major difference is the manner of data storage. In an ODS, data is stored in flat tables; by flat we mean ordinary transparent tables. A cube, in contrast, is composed of multiple tables arranged in a star schema and joined by SIDs. The purpose is multi-dimensional reporting.
Another difference: in an ODS you can update an existing record given its key. In a cube there is no such thing; it accepts duplicate records and, during reporting, sums the key figures up. There is no editing of previous record contents, only adding. With an ODS, the procedure is "update if existing (based on the table key), otherwise add the record".
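To illustrate the two update semantics, the same contrast can be sketched with ABAP internal tables (this is only an illustration with made-up field names, not actual BW load code): a keyed read plus MODIFY/INSERT reproduces the ODS "overwrite if existing, otherwise add" logic, while COLLECT reproduces the cube's additive logic by summing the numeric fields for identical keys.

REPORT zods_vs_cube_demo.

TYPES: BEGIN OF ty_rec,
         doc_no TYPE c LENGTH 10,              " key field / characteristic
         amount TYPE p LENGTH 10 DECIMALS 2,   " key figure
       END OF ty_rec.

DATA: lt_ods  TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY doc_no,
      lt_cube TYPE STANDARD TABLE OF ty_rec,
      ls_rec  TYPE ty_rec.

ls_rec-doc_no = '4711'.
ls_rec-amount = '100.00'.

" ODS-like behaviour: overwrite the record if its key exists, otherwise add it.
READ TABLE lt_ods TRANSPORTING NO FIELDS
     WITH TABLE KEY doc_no = ls_rec-doc_no.
IF sy-subrc = 0.
  MODIFY TABLE lt_ods FROM ls_rec.    " update existing record
ELSE.
  INSERT ls_rec INTO TABLE lt_ods.    " add new record
ENDIF.

" Cube-like behaviour: identical keys are never overwritten; COLLECT
" sums the numeric (key figure) components instead.
COLLECT ls_rec INTO lt_cube.
COLLECT ls_rec INTO lt_cube.          " amount for doc_no 4711 is now 200.00

The point is only the contrast in semantics: the ODS key decides whether a record is replaced, while the cube simply accumulates key figures for identical dimension keys.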
ODS
- Stores line-item-level detail; more granular.
- Aggregates cannot be created on an ODS.
- ODS objects are based on flat tables.
- Only two-dimensional reporting is possible on an ODS.
- Overwrite feature is available while loading records.
Infocube
- Stores summarized data; less granular.
- Aggregates can be created on top of InfoCubes for better query performance.
- Multi-dimensional reporting is possible on an InfoCube.
- There is no overwrite feature while loading records.
InfoCubes are multidimensional objects in which fact and dimension tables are available, whereas an ODS is not a multidimensional object and has no fact or dimension tables; it consists of flat transparent tables.
In InfoCubes there are characteristics and key figures, whereas an ODS has key fields and data fields; non-key characteristics can be kept in the data fields.
Sometimes we need detailed reports, and these we can get through an ODS. ODS objects store data in granular form, i.e. with a higher level of detail, while the data in an InfoCube is aggregated.
From a reporting point of view, an ODS is used for operational reporting, whereas InfoCubes are used for multidimensional reporting.
ODS objects can be used to merge data from one or more InfoSources; InfoCubes do not have that facility.
The default update type for an ODS object is overwrite; for an InfoCube it is addition. ODS objects are used to implement delta handling in BW. Data is loaded into the ODS object as new records, as updates to existing records in the change log, or as overwrites of existing records in the active data table, controlled by 0RECORDMODE.
You cannot load data into an ODS using the IDoc transfer method, but you can into an InfoCube.
You cannot create aggregates on an ODS. You cannot create InfoSets on an InfoCube.
ODS objects can be used in the following scenarios. An ODS is not mandatory; whether to use one depends on the requirements.
- When you want to use the overwrite facility, i.e. when you want to overwrite non-key characteristics and key figures in the data fields.
- When you want detailed reports.
- When you want to merge data from two or more InfoSources.
- When you want to drill down from an InfoCube to the ODS through the RRI interface to get detailed data.
- When you want to create an external file.
The most important difference between an ODS and an InfoCube is the existence of key fields in the ODS. In the ODS you can have up to 16 InfoObjects as key fields; any other InfoObjects will either be added or overwritten. So if you have flat files and want to be able to upload them multiple times, you should not load them directly into the InfoCube; otherwise you need to delete the old request before uploading a new one. The disadvantage is that if you delete rows in the flat file, the rows are not deleted in the ODS.
I also use ODS objects to upload control data for update or transfer routines. You can simply do a SELECT on the ODS active table /BIC/A<ODS name>00 to get the data.
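A minimal sketch of such a lookup in a transfer or update routine might look like this; the ODS name ZCTRL, the generated table name /BIC/AZCTRL00, and the field /BIC/ZPARAM are assumed purely for illustration:

" Read control data from the active table of an ODS object.
" The ODS ZCTRL and its field /BIC/ZPARAM are hypothetical examples.
DATA lt_ctrl TYPE STANDARD TABLE OF /bic/azctrl00.

SELECT * FROM /bic/azctrl00
       INTO TABLE lt_ctrl
       WHERE /bic/zparam = 'PLANT_MAPPING'.

IF sy-subrc <> 0.
  " fall back to a default if no control entries are maintained
ENDIF.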
An ODS is used as an intermediate storage area of operational data for the data warehouse. It contains highly granular data and is based on flat tables, which results in simple modeling. We can cleanse, transform, merge, and sort data to build staging tables that can later be used to populate an InfoCube.
An InfoCube is a multidimensional data container used as a basis for analysis and reporting. An InfoCube consists of a fact table and its associated dimension tables in a star schema: the fact table appears in the middle, surrounded by several dimension tables. The central fact table is usually very large, measured in gigabytes; it is the table from which you retrieve the interesting data. The dimension tables amount to only 1 to 5 percent of the size of the fact table. Common dimensions are unit, time, etc.
There are different types of InfoCubes in BW, such as basic InfoCubes, remote InfoCubes, etc.
An ODS is a flat data container used for reporting and for data cleansing/quality assurance purposes. It is not based on a star schema and is used primarily for detail reporting rather than for dimensional analysis.
An InfoCube has a fact table, which contains the facts (key figures), and relations to dimension tables. This means that an InfoCube consists of more than one table, and these tables all relate to each other. This is also called the star schema, because the dimension tables all relate to the fact table, which is the central point. A dimension is, for example, the customer dimension, which contains all data that is important for the customer.
An ODS is a flat structure: just one table that contains all the data. Most of the time you use an ODS for line-item data, which you then aggregate into an InfoCube.
An ODS holds transaction-level data; it is just a flat table and is not based on the multidimensional model. An ODS has three tables: 1. active table, 2. change log, 3. new data (activation queue) table.
A cube holds aggregated data that is not as detailed as an ODS. A cube is based on the multidimensional model and has two fact tables: 1. E table (compressed), 2. F table (uncompressed).

More Interview Questions On CIF

1. We have 50 integration models for each object type, since we have 50 plants. Should we define fewer integration models?
For Plug-In releases prior to 2002.1, we recommend that you define fewer models for performance reasons. Generally, the size of the integration models depends on the data volume for each plant. To optimize the number of integration models, we recommend that you obtain consulting expertise.
As of Plug-In 2002.1, the "Runtime version of the integration model" is available. Using the runtime version guarantees better performance in the online operation (also refer to the documentation for the report RCIFIMAX).
Even though the number of integration models does not affect the performance significantly, we recommend that you keep the number of integration models low, in order not to increase the runtime for generating the runtime model.
That is, do not regularly create new integration models, rather only create new versions of integration models.

You can find release notes for the PlugIn on SAP Service Marketplace at: "http://service.sap.com/R3-PLUG-IN" -> Media Center -> Release Notes PI 2002.1 Release Notes SAP APO.

2. Do we have to transfer the master data of the vendor together with the stock data, so that consignment stocks are transferred?
Yes - this ensures that the consignment stock is correctly linked to the vendor location in SAP APO.

3. Master record objects that were changed since the last transfer are transferred again to SAP APO by initial transfer.
Does this mean that the report RCPTRAN4 (evaluate and send change recordings) does not have to run?
And what about the report RBDCPCLR (delete change pointers) for reorganizing the change pointers?
You do not have to execute the report RCPTRAN4 in this case, since the dataset in SAP APO is up to date due to the initial data transfer.
You should use the report RBDCPCLR to delete "old" change pointers.

4. The master and movement data for a material 4711 is in two active integration models (A+B). Assuming that one of the two is deactivated - what happens then?
The master data and movement data remain active. See also Note 533755 "Description of the delta logic or the program RIMODINI".

5a. What happens if you deactivate an integration model that has master record objects?
Planning in SAP APO is still possible. However, you can no longer transfer the transaction data to SAP R/3.
5b. What happens with the master and movement data in SAP APO after the master data was deactivated?
The master data remains in SAP APO.
5c. What happens with the transaction data if there is another activation?
The transaction data is transferred again. Provided that you reschedule (for example, by automatic planning; this does not apply to planned/manufacturing orders), the old transaction data is deleted. Note that the integration model for the master data must also be active if the transaction data is transferred again.

6. How do I change from small to large integration models?
You activate the large model (all data already selected in active models is not transferred again) and then deactivate the small models.

7. Why are my orders not transferred from SAP R/3 to SAP APO?
Refer to the information contained in Note 424927 "No order transfer from R/3 to APO" and check your settings accordingly.

8. My material removals are not transferred in the APO order, but the stocks change.
Refer to the information contained in Note 421940 "No reduction of order reservations in APO" and check your settings accordingly.

9. Can data be transferred from SAP R/3 to SAP APO using BTE change pointers (for example from the table MBEW table using user exits)?
Since the APO standard system does not require data from the table MBEW, this data is not transferred to the CIF during the transfer of data changes using BTE. Via BTE, data for all SAP standard fields is transferred from SAP R/3 to SAP APO from the table MARA (plant-independent material data), the table MARC (plant-dependent material data), the table MARM (conversion of units of measure) and the table MAKT (material texts). In the customer exit in SAP R/3, likewise only this data is available. An alternative here is the transfer of the material master changes using ALE change pointers.
For example: transferring the "moving average price/periodic unit price" (MBEW-VERPR) using the user exit CIFMAT01 does not work. For this, the Customizing in BD52 must be changed and the data must be transferred using the ALE method.
Changes to customer-specific fields can also only be transferred to SAP APO using ALE in connection with customer exits.

10. How can I avoid overlaps and thereby inconsistencies during the integration model transfer?
If you use parallel processing for the initial data transfer, transaction data may be transferred to SAP APO before the corresponding master data is available in SAP APO. For example, you can then create in-house production orders in SAP APO without PPM even though this should not be the case. Unfortunately, this cannot be prevented technically. The integration models must be cut accordingly and scheduled in background jobs so that this does not happen. Background jobs also check whether queues have been processed correctly and without errors.

11. Where can I find information about parallel processing during the initial data transfer?
You can find release notes for the PlugIn on SAP Service Marketplace at: "http://service.sap.com/R3-PLUG-IN" -> Media Center -> Release Notes => PI 2002.1 Release Notes SAP APO.
Application log
1. Is there a way of analyzing errors in the partner system directly from the application log?
For information about this, see the following notes:
Note 396838 "R/3: Displaying application log from queue entry"
Note 396839 "APO: Jump to application log from incorrect queue entry"
Note 457399 "Branching to the application log with inbound queues"
Note 457418 "APO: Branching to the application log with inbound queues"

2. How can I find CIF logs?
In the R/3 and APO SAP systems, you can analyze the application log using the following transactions:
SAP R/3 transaction CFG1 (see also Note 544011) and SAP APO transaction /N/SAPAPO/C3 (see also Note 544389).
Interactive user
Question: When do I have to create a dialog user if no ATP check is to be used?
Answer: This is necessary for analyzing the data transfer and for debugging. Also check note 352844
As of PlugIn 2002.2, it is possible to work with separate authorizations for every application.
SNP PPMs
Question: Are SNP PPMs taken into account in change management?
Answer: No (version PlugIn 2001.2).
Questions on release statuses
1. You want to use a new SAP APO 3.1 with the same system name as your old SAP APO 3.0, which is deactivated. Does this work?
Yes, as long as the "old" APO System is deactivated. The name for a logical system (LOGSYS) can only be assigned once.
You must also consider the following: in the SAP R/3 system, unique GUIDs are created for the mapping between SAP R/3 and SAP APO documents (see the "CIF*MAP" R/3 tables). This may cause discrepancies in the assignment of GUIDs to documents in SAP APO when you start a new initial data transfer.

2. Does SAP APO 3.1 work with PI 2001.1?
PI 2001.2 is the minimum requirement in this case. For further questions on the PlugIn release, go to SAP Service Marketplace. Here you will find further information at "http://service.sap.com/R3-PLUG-IN" -> Integration of SAP R/3 and mySAP.com Components.
1 SAP R/3 with several SAP APOs
Question: A client of an SAP R/3 system is to be operated with several SAP APO Systems (Release 3.0 and 3.1). Does this cause problems?
Answer: In theory, this does not cause problems. However, note the following: a planned order or production order (for example order 4711), a PReq (for example PReq 4712, item 0010), or a sales order item can only be sent to one SAP APO system; in other words, a PReq created in SAP APO system 1 is not copied to SAP APO system 2. The SAP APO systems must plan different material/plant combinations.

No stock transfers should occur between the SAP APO systems.

This would cause problems because transaction data that was sent from the R/3 system to both APO systems may receive different updates in the retransfer from the two APO systems. Even if the updates from both APO systems are the same, they cannot be processed in such a way that a consistent status is achieved afterwards.

In the case of other objects, like TP/VS and production campaigns, problems may occur because updates from the APO systems can no longer occur in an indivisible logical unit of work (LUW). This may be the case if some of the referencing transaction data originates in one of the APO systems and other transaction data originates in the other APO system.

Further problems may occur in the APO systems due to different release levels, if the release level of the APO system is relevant for shipping in the R/3 outbound. In this case, it cannot be guaranteed that all target systems will always be handled in a loop for all object types before each APO release query.
qRFC monitor (transaction SMQ1/2)
1. Can I restrict access to the 'Delete' function in transaction SMQ1 using authorizations (the display and processing functions should still be available to the user)?
There are three authorization groups for transactions SMQ1 and SMQ2:
  • Group 1 cannot call SMQ1/SMQ2 at all.
  • Group 2 can call SMQ1/SMQ2 but can only display queues (not delete them!) and activate them. The transaction authorization for SMQ1 and SMQ2 is required for this.
  • Group 3 can call SMQ1/SMQ2 and use all functions. The value NADM must be defined for this in the authorization object S_ADMI_FCD.

2. Is there a better display of the queues than the qRFC monitor for outbound queues (SMQ1) or inbound queues (SMQ2)?
  • Yes, in SAP APO you have the SCM Queue Manager in transaction /N/SAPAPO/CQ (see also Note 419178).
  • As of SCM 4.1, you can also use the CIF cockpit (transaction /SAPAPO/CC), which provides an overview of and access to all CIF-relevant transactions and Customizing settings of the APO system and all connected ERP systems.
The CIF Cockpit
As of SCM 4.1, you can use the CIF cockpit (transaction /SAPAPO/CC) in SAP APO. It provides an overview of and access to all CIF-relevant transactions and Customizing settings of the APO system and all connected ERP systems.
CIF queue names
For a list of all current CIF queue names that are used to transfer data between ERP systems and SAP APO, refer to Note 786446.

Friday, January 21, 2011

Nice SCM portal

Hi Guys,
Today I came across a blog by Shaun Snapp. His articles give you very useful information regarding SCM APO.
If possible, visit this link:
http://www.scmfocus.com/

Thursday, January 13, 2011

BAPI and ALE Integration


The objective of "Business Application Programming Interfaces" (BAPIs) and "Application Link Enabling" integration is to enable future ALE scenarios to use BAPI interfaces. One advantage is that it will be much easier for both SAP Development and SAP customers to develop new ALE scenarios. Another advantage is that BAPI interfaces will be able to use already existing ALE functions, (e.g. error handling and writing links asynchronously).
Further advantages are:
  • object oriented approach
  • application maintains one interface only
  • reduction of generation program errors
Description of Function
When a BAPI is defined, ALE outbound and inbound interfaces are generated and entered in transport requests, provided that the following functions are generated at the time the BAPI is defined:
  • an IDoc type and its segments (IDoc = intermediate document)
  • a "wrapper" function module which decides whether the BAPI is called locally or from another system (see Overview Graphic). Local calls can call the BAPI immediately or via an IDoc which restarts it.
  • an ALE outbound function module which puts BAPI interface data into an IDoc and triggers ALE outbound processing
  • an ALE inbound function module which transfers BAPI interface data from an IDoc into the BAPI interface structure and calls the BAPI
  • ALE customizing for the new interface.
The flow diagram in the Overview Graphic shows the run time BAPI call flow logic.
The generation can also be used when an R/3 System is connected to a non-R/3 System. "One-way" means that either outbound or inbound but not both is implemented in the R/3 System. This means that a one-way interface cannot be used to exchange data between two R/3 Systems.
The generation can only be used for asynchronous interfaces, i.e. a receiver processes data without return parameters to the sender.
In principle the generation can also be used to support an IDoc interface for Electronic Data Interchange (EDI).



To call up the BAPI "BAPI_X_CREATE" in System 1, the program calls the generated "wrapper" function module "ALE_X_CREATE" whose interface contains all the BAPI interface parameters.
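A call of the generated wrapper might look like the following sketch; the parameter names, the structures ZBAPI_X_HEADER/ZBAPI_X_ITEM, and the wrapper name itself are placeholders from the example above, since the real interface simply mirrors the parameters of the underlying BAPI:

" Sketch only: BAPI_X_CREATE / ALE_X_CREATE and their parameters are placeholders.
DATA: ls_header TYPE zbapi_x_header,
      lt_items  TYPE STANDARD TABLE OF zbapi_x_item,
      lt_return TYPE STANDARD TABLE OF bapiret2.

" The wrapper decides, based on the ALE distribution model, whether to
" call BAPI_X_CREATE locally or to pack the data into an IDoc and send
" it to the receiving system for asynchronous processing.
CALL FUNCTION 'ALE_X_CREATE'
  EXPORTING
    header = ls_header
  TABLES
    items  = lt_items
    return = lt_return.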
Application Area
All applications which create a write BAPI can use this functionality, especially when the BAPI is to be called from another R/3 System. Customers developing their own ALE scenarios also benefit from these advantages.

Tuesday, November 30, 2010

SAP Supply Network Collaboration (SNC) Overview

Overview of SAP SNC:
SAP Supply Network Collaboration (SNC) is a component of SAP Supply Chain Management (SAP SCM). It was formerly known as SAP Inventory Collaboration Hub (SAP ICH). Because enhancements broadened the collaboration environment, the product was renamed to reflect its extended capabilities for collaboration with suppliers (extending beyond inventory collaboration).
The supplier does not require a specialized electronic data interchange (EDI) infrastructure to integrate with the supply network, which makes this solution usable by companies of all sizes. All that is required on the supplier side is internet access and a web browser.

It serves as a joint-use platform for all business partners involved in the inventory collaboration process.
Supply Chain Collaboration is a key area in the mySAP SCM solution that supports collaborative planning and the exchange of documents with suppliers and customers. Collaborative planning can be done in the SAP Supply Network Collaboration (SNC) or in APO.
SAP Supply Network Collaboration supports the two collaborative business processes SMI (Supplier Managed Inventory) and VMI (Vendor Managed Inventory).

Supplier Collaboration:
You can use SAP Supply Network Collaboration (SNC) to optimize cooperation with your vendors. You and your vendors use SAP SNC as a common platform to control and monitor the replenishment process for materials.
For example, the vendor takes on responsibility for the stocks at your location. You and your business partner agree on minimum and maximum stock levels for the product at this location. The vendor monitors the agreed stock level with the SAP SNC system, makes sure your stocks are replenished promptly, and informs you about pending deliveries. This ensures production flows smoothly and without any downtime. Both business partners can check the stock level of all materials using the SAP SNC system. The SAP SNC system generates alerts and informs both partners about critical situations, which enables them to react promptly. Employees can make individual settings about how they would like to be informed of these alerts, for instance by e-mail. This is especially helpful for employees that do not work with SAP SNC on a daily basis.

Customer Collaboration:
Using the VMI process, a VMI analyst develops an unrestricted forecast of future customer demand. Statistical forecasting and extrapolation methods are applied to historical sales data transferred from the customer when making this forecast. By means of replenishment planning, you can then determine the quantities to be delivered to a customer location (such as a distribution center) in order to meet customer demand and maintain the required degree of service. Starting with the requirements forecast, the replenishment process determines the optimum short- and medium-term plan required to meet the estimated requirement. This plan contains the quantities to be transported from the vendor distribution center (or production plant) to the customer distribution center (or branch).
Feasible transportation units can then be assembled based on business rules. Such rules concern constraints for the relevant means of transportation. This ensures that the appropriate minimum and maximum capacities are kept to.
In the consumer goods industry, menu pricing agreements between vendors and customers define these business rules. These agreements specify how transportation loads are to be put together with respect to logistical parameters such as permitted product and pallet combinations, maximum weight and volume.
It is in the interest of both vendors and customers to optimize the cost-benefit situation in order to increase the overall profitability of the supply chain. In this way, transportation and storage costs are kept to a minimum for vendors.
Customers achieve highly profitable replenishment intervals without their warehouses becoming overstocked and are at the same time in a position to react quickly to short-term fluctuations in demand.

So where exactly do we collaborate?
With customers: at finished goods replenishment.
With suppliers: at raw material procurement.

Integration Levels:

There are three integration levels available between customer and supplier in SAP SNC:

1. Web portal – generally used for low transaction volumes
2. File transfer – used when suppliers have limited internet connectivity
3. B2B – the best option for high transaction volumes





 
Data Flow in SAP SNC:

 
We know how data is transferred from ECC to the APO system. When we talk about data transfer between these two systems, the Core Interface (CIF) plays a vital role in transferring the master data between the ECC and SNC systems. Transactional data, however, can also be transferred through a middleware called Process Integration (PI).

ECC Master Data:
 Plant
Material
Vendor Master
Info Records
Customer Master
Scheduling Agreements

SNC Master Data:

1. Generic Master Data
2. Process-specific Master Data

Generic Master Data:
Model 000,
Planning version 000,
Ship-from location,
Customer location,
Products,
Customer,
Supplier,
Transportation Lane between ship from location and customer location.

Process-specific Master Data:

You maintain process-specific master data for certain SNC business processes; for example, for work order collaboration, master data is required before we start working on the scenario.
Tcode : /SCA/MFGCFG 

Data Storage in SAP SNC:

SAP SNC has its own data storage model, consisting of:
- Time Series Data Management (TSDM), which stores and retrieves time series data such as forecasts
- Order Data Management (ODM), which stores and retrieves order documents of any type that might occur in an SCM execution or planning process, such as purchase orders or deliveries
- Lean Inventory Management (LIME), which stores and retrieves inventory data

The liveCache in which SAP APO stores transaction data is not used by SAP SNC.
You do not need to configure Time Series Data Management in order to run the basic SNC processes.
You need to activate Order Data Management, but no further configuration is required.


Monday, November 29, 2010

FAQS on Core Interface

Which operations of a routing or recipe in R/3 or ERP are transferred to APO PPMs?
When PPMs are created in APO from recipes in R/3, only those phases containing non-zero machine times are brought over. Other phases having only labour (resource category 003), or with machine time (resource category 001) equal to zero, do not come through.

What determines the validity period of a resource in liveCache?
The CFC9 parameter in R/3 does not set the from/to validity dates for resources when work centres are CIFed from R/3 to APO. That is actually set according to the entry in table /SAPAPO/RESLCT (Length of Time Stream or Bucket Vector in liveCache).

What needs to be done to debug CIF related enhancements?
In order to debug CIF-related user exits or other CIF queues, set the R/3 RFC user (e.g. STGUSER) to a dialog user and then set the queues to "Debugging On/Record T/QRFCs" in transaction CFC2 in R/3. This is for queues coming inbound to APO. For queues coming inbound to R/3, set the APO RFC user ID (e.g. APSUSER) to a dialog user in the same way.

How can transaction data be reconciled with APO from R/3 or ERP side?
Program RCIFORDT can be used to reconcile transaction data from the R/3 side. Refer to OSS Note 733110; as a long-term solution, implement this BAdI on the APO side.
Apply OSS Notes 627630 and 804034 on the R/3 side, together with the BAdI OSS Note 800286.

Which table stores Change Pointers in ERP or R/3?
The table BDCPV stores change pointers in both APO and R/3. Refer to OSS Note 329110.
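As a quick check of how many unprocessed change pointers have accumulated for a given CIF message type, a simple count on BDCPV can be used. A minimal sketch (MESTYPE and PROCESS are the standard change pointer fields; verify the field names in your release):

" Count change pointers for the material master message type
" that have not yet been processed.
DATA lv_count TYPE i.

SELECT COUNT(*) FROM bdcpv
       INTO lv_count
       WHERE mestype = 'CIFMAT'
         AND process = space.

WRITE: / 'Open CIFMAT change pointers:', lv_count.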

Which report can be used to clear Change Pointers in ERP or R/3?
Program RBDCPCLR can be used to clear change pointers from the Change Pointer table. The message type CIFSRC is for changes to Source of Supply (i.e. Purchasing Inforecords). The CIF Change Transfer program (CFP1) can fail due to a large number of records in the Change Pointer table. Other message types are CIFMAT for material, CIFVEN for Vendor Master, CIFCUS for Customer master.
What is Customer Consignment Stock and how is it transferred to APO?
Consignment stock at a customer is stock located at the customer's premises but owned by the supplier. Prerequisites for consignment stock at a customer are the customer as a location and a product master at that customer location. To transfer customer consignment stock from ERP to APO, both "Customer" and "Special Stock at Customer" must be part of an integration model during the initial transfer. For consignment stock batches, the "Storage Location Stock" object must be included in the integration model.
Reference: Note 409298

What is Vendor Consignment Stock and how is it transferred to APO?
Vendor consignment stock is stock at the plant location that is treated as normal storage location stock but owned by the vendor. Vendor consignment stock is transferred from ERP to APO as part of the "Storage Location Stock" integration model. However, the vendor must also be part of an active integration model.

How is Stock information stored in APO?
Stock in the SCM APO system (up to release 4.1) consists of a stock anchor stored in the database and a stock item stored in liveCache. The liveCache consistency check (transaction /SAPAPO/OM17) carries out the consistency check between the APO database and liveCache. It should be executed periodically to delete obsolete stock anchors from the database.
Reference: Note 492591
As of SCM 5.0 Stock information is stored in liveCache table /SAPAPO/STOCKANC.
Reference: Note 837744

What are the userexits for Integration of Stocks?
On R/3 side the userexit is enhancement CIFSTK01 while on APO side it is enhancement APOCF011. The source code in the userexit should be copied to the CIF Compare/Reconcile Report (Delta Report) BAdI (method RELEVANT_FOR_COMPARE_R3_STOCK of BAdI definition /SAPAPO/CIF_DELTA3) for correctness of the report.
Reference: Note 492591

How are Inspection Lots handled in APO?
From SCM 4.0, inspection lots are separate objects in APO that retain the end dates. Hence inspection lot quantities do not show up as stock in quality inspection.
How is Cross-company Stock In Transit handled in APO?
Cross-company stock in transit is determined dynamically in R/3 and is hence not transferred to APO. In APO, stock in transit at the receiving plant can be handled by transferring inbound shipping notifications or goods confirmations from R/3.

What is the Data Load for Transaction Data that the CIF can typically handle in a day?
"It Depends" (need better answer here)

Configuring E-Mail Alerts in SAP SNC

SAP Supply Network Collaboration (SAP SNC) is an innovative, web-based component that supports SAP's global vision for adaptive supply chain networks. The component, based on the mySAP Supply Chain Management (mySAP SCM) solution, supports a range of cutting-edge business scenarios in supplier and customer collaboration environments. SAP SNC provides a powerful and comprehensive means of enhancing cooperation, efficiency, and knowledge sharing throughout the supply chain.

This example demonstrates creation of Alerts for New Purchase Orders received in SAP SNC and sending E-Mail Alerts to the Supplier informing him about the new PO created in SNC.

To configure e-mail alerts specific to a supplier collaboration scenario, we need to configure a Message Profile with the information about the fields that need to be sent in an e-mail.

For Creating Message Profile from SAP Easy Access Menu of SCM (SNC), use the following path.
SAP Menu -> SCM Basis -> Alert Notification Engine -> Settings -> Create/Change Message Profile or T.Code "/SCMB/ANOTMP" as shown in the following fig.



In the Change view Click "New Entries" Button to create a new Message Profile.



Create the Message Profile with the details as shown below.



After creating the Message Profile, add parameters to the profile by selecting the Message Profile and clicking "Message Parameters" in the tree on the left. Click "New Entries" and add the parameters.



After creating the Message Profile, create an Alert Profile in the Alert Monitor via the following menu path.
SAP Menu -> SCM Basis -> Alert Monitor -> Alert Monitor or T.Code "/SAPAPO/AMON1".




In the following screen, select the Button "Overall Profile" to create Alert Notification Profile and Alert Profile.




In the next screen, Use menu Option Goto -> Automatic Messaging





In the Create/Change Alert Notification Profile, click "Generate New Profile" button.




Provide the name and parameters for Alert Notification profile as shown below.








After creating Alert Notification Profile, Click the "Confirm Profile" Button.



Click "Back" button or Function key F3 and create Alert Profile as shown below.

Provide a name for Alert Profile and select the tab "SNC-SMI". In the Alert types, select Purchase Order and check the items New Purchase Order Item, Change Purchase Order Item checkboxes. Click the "Save SNC-SMI Alert Profile" button under Application Specific Alert Profile.



After creating the Alert Profile, the profile needs to be applied to suppliers in the web UI of SNC via the following menu path.

SAP Menu -> Supply Network Collaboration -> Web UIs for SAP SNC -> Special Views -> Customer View or T.Code "/SCA/ICH_C".



Log on to web UI by providing ID and password. Select the Alert Monitor under Menu "Exceptions". In the Alert Monitor screen, select the button next to button "Reset" and select option "Save as" as shown below.



In the pop up window provide a Name and select "Partner-Specific" from the dropdown list and click "OK" Button.



Select the Supplier Number, Alert Category and Alert Type, and save the selection as shown below.





After creating the Selection variant, click "Set Notification" button to create Profile.



Provide the details like E-Mail ID, Message Profile, and Minimum Priority as shown below and click "Save".



After this configuration, the profile can be tested by sending a Purchase Order to SNC. As soon as SNC receives the Purchase Order, an e-mail is triggered and sent to the supplier informing them about the new Purchase Order received in SNC.