Needing to introduce new depreciation areas in SAP Asset Accounting? Not specifying a date interval for depreciation parameters can have far-reaching consequences. Learn why from one of our SAP experts!
Period-end closing is always a critical, high-visibility activity in any ERP. Timely and effective monitoring and control of period-end closing is a top priority for the finance department. To achieve it, you must process a sequence of interdependent steps diligently in a specific, well-defined order. Entities use various tools for this, ranging from Excel to more sophisticated tools such as the SAP Financial Closing Cockpit or BlackLine's Runbook.
Learn the basics of the Material Ledger in SAP S/4HANA! Explore processes such as actual costing, balance sheet valuation, and COGS valuation—and how they’ve changed since SAP ERP.
Design Summary and Guiding Principles
Develop a solution to improve the budgeting and consolidation business processes of the Company.
Integrate data in a centrally accessible repository storing one version of the truth.
Make the monthly reporting preparation more transparent and efficient by aligning process owners. Enhance collaboration between contributors.
Improve business information, enable business visibility and analysis of the cost base and standardize the management reporting outputs.
With SAP being a 46+ year old company, most SAP consultants are seasoned in the on-premise world, with deep knowledge and expertise in configuring systems, designing SAP to fit business needs, and following various implementation approaches. Now that the S/4HANA Cloud version is gaining popularity and proving useful in various scenarios (professional services, affiliates of larger enterprises in two-tier implementations, and so on), learning how it differs from the on-premise world is valuable. (Also refer to the blog “Critical insight into S/4HANA Cloud compared to S/4HANA On Premise”, which shows some differences between S/4HANA Cloud and S/4HANA on-premise.)
Let’s look at some key watch-outs in an S/4HANA Cloud (S4HC) implementation:
SAP Fiori is a new user experience (UX) for SAP software and applications. It provides a set of applications that are used in regular business functions such as work approvals, financial apps, calculation apps, and various self-service apps.

The SAP user interface, or SAP GUI as we know it today, was first introduced in 1992 together with the official release of SAP R/3. SAP R/3, the client-server edition, was the successor to the SAP R/2 release, the mainframe edition. Although SAP has made several attempts to modernize SAP GUI, an end user from the time it was introduced would still find their way around today. Many transactions and screens have remained the same or changed very little. Since the initial release of SAP GUI, SAP has released several alternative user interfaces such as the SAP Workplace (which was part of the mySAP.com offering), the SAP Enterprise Portal, and the NetWeaver Business Client (NWBC). None were as successful as SAP GUI except, perhaps, for the NetWeaver Business Client, which is, however, an extension to the SAP GUI.

The conclusion of all this is that although many people complained about the old-fashioned look of SAP GUI, they kept using it and will probably continue to do so in the future. But there is no denying the fact that the user community is changing fast. The SAP users of tomorrow are the youngsters of today, who are used to accessing data from their mobile devices. To them, SAP GUI is a relic from the dark ages. This shift is not limited to youngsters: many end users want data access from any device, from any place, at any time.

SAP released SAP Fiori to respond to this demand. SAP Fiori is built using the modern design principles you might expect from applications designed for smartphones and tablets. There are already more than 500 role-based Fiori applications for areas such as HR, finance, and business intelligence. An SAP Fiori application is always limited to a specific task or activity.
The design is responsive and deployable on multiple platforms. There are three types of SAP Fiori applications: transactional apps, fact sheets, and analytical apps.
Transactional or task-based applications
The transactional SAP Fiori applications are limited to specific tasks such as entering a holiday request or expense note. They give end users fast access to data and represent a simplified view of an existing business process or workflow.
Fact sheets have far more capabilities than transactional applications. From a fact sheet, you can drill down into the details. You can even navigate from one fact sheet to another or jump to the related transactional applications. For fact sheets, the underlying database must be SAP HANA. An example of a fact sheet is an application that shows the overview and details of a piece of equipment and its maintenance schedule.
Analytical applications build on business intelligence using the capabilities of SAP HANA. They allow you to monitor key performance indicators (KPIs) of your business operations and to react immediately as changes occur. An example is the sales orders application, which immediately shows sales representatives the sales history of their customer, allowing them to make discount decisions on the spot.
SAP Fiori apps consist of front-end components, which provide the user interface and the connection to the back end, and back-end components, which provide the data. The front-end and back-end components are delivered as separate products and must be installed in a system landscape that is enabled for SAP Fiori. There are multiple deployment options for the SAP Fiori components, each with its respective advantages and disadvantages. SAP Fiori applications are accessed through the SAP NetWeaver Gateway. The gateway consists of two components: SAP Gateway Foundation (SAP_GWFND) and User Interface Technology (SAP_UI). Both components are add-ons which, from NetWeaver version 7.4, are part of the SAP NetWeaver ABAP stack. With NetWeaver 7.31, the components had to be deployed separately. This means that any system built on SAP NetWeaver, such as SAP ERP or SAP CRM, can be used to deploy SAP Fiori applications. The following deployment options exist: central hub deployment, the embedded scenario, and the cloud edition (see Figure 2.1).

Figure 2.1: SAP Fiori deployment options
Central hub deployment
The central hub deployment is the preferred option. Here, SAP NetWeaver Gateway is installed as a separate system. The Fiori applications are deployed here and access the data on the back-end business systems, such as SAP ERP or SAP CRM. Although this option implies an extra system, thus a higher total cost of ownership (TCO), it enables a multi-back-end system scenario while ensuring a consistent look and feel for the different applications. The central hub can be considered a single point of access for all mobile applications. In addition, installing SAP NetWeaver Gateway on a separate system allows you to move the system behind or in front of the firewall depending on your current network topology and security requirements.
Embedded scenario

SAP NetWeaver is the basis of all ABAP-based SAP applications, regardless of whether you are talking about SAP ERP, SAP BW, or any of the others. As the gateway is an add-on for SAP NetWeaver, it is available on every ABAP-based business application. This means that it can be activated and that Fiori applications can be deployed on any system, making an extra system unnecessary. However, we do not recommend the embedded scenario as, in contrast to the central hub deployment, it results in Fiori applications being installed all over the place, negating the advantage of a single point of access for all mobile applications. The embedded scenario should only be considered during a proof of concept or when the deployment of mobile applications will be limited to a single SAP application such as SAP ERP.
Cloud edition

The SAP Fiori cloud edition is a ready-to-use infrastructure that serves as a front end while leaving the back-end systems on premise. The connection to the SAP Fiori Cloud is realized via the SAP Cloud Connector, which must be installed on premise. The back-end components still have to be installed on the back-end systems.
Comparison of the deployment options
Table 2.1 compares the different deployment options. Every deployment option has its respective advantages and disadvantages, and the importance of the pros and cons differs in every customer situation. We strongly recommend the central hub deployment option as it enables a single point of access to your mobile applications for SAP ERP, SAP BW, and many others, while at the same time ensuring the same look and feel. Due to its limitations and dependencies, the embedded scenario should only be considered in a proof-of-concept scenario.
As you are probably aware, S/4HANA has brought various innovations to profitability analysis when it is account-based (ABCOPA): integration of ABCOPA into the Universal Journal (table ACDOCA), resulting in a single source of truth; various Fiori reports based on ABCOPA; automatic reconciliation of legal/management accounting with profitability analysis; drilldown from the income statement to COPA characteristics; real-time splitting of COGS; and more. From S/4HANA 1809, statistical pricing conditions in sales can also be transferred to accounting in extension ledgers.
So there is a clear motivation for organizations to move to ABCOPA, but how do you do that if a client is converting their existing ECC system to S/4HANA? SAP does not offer an out-of-the-box solution for migrating costing-based COPA (CBCOPA) to ABCOPA. Here are the considerations a business should check when performing such a CBCOPA-to-ABCOPA migration:
Determine the various flows relevant to COPA in your existing system
There can be different data sources in your system that post to CBCOPA, for example:
Value flow from billing documents
For the billing flow, the condition record/value field settings in the pricing procedure of billing documents need to be analyzed and mapped, along with the revenue account settings from VKOA account determination, to the target G/L account in ABCOPA.
Value flow from FI direct postings
The value field mapping in the PA transfer structure for direct postings should be mapped to the target G/L account in ABCOPA.
Value flow from Orders/ Project Settlements
The value field mapping in the PA transfer structure for settlements should be mapped to the target G/L account in ABCOPA.
Value flow from internal allocations
Here the assessment cycles need to be read for the various segments, and the value fields then mapped to the target G/L accounts in ABCOPA.
Value flow from production order variances
Here the value fields from the relevant valuation strategy of the cost component structure need to be mapped to the target G/L accounts in ABCOPA.
For reading these objects, the controlling area can be derived from the company code and operating concern from the controlling area.
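Conceptually, each of these flows boils down to a value-field-to-G/L-account lookup. Below is a minimal Python sketch of that idea; the value field names and account numbers are purely illustrative, not taken from any real operating concern:

```python
# Hypothetical mapping of CBCOPA value fields to ABCOPA G/L accounts.
# Real value field names (VVxxx) and account numbers depend on the
# operating concern and chart of accounts of the specific system.
VALUE_FIELD_TO_GL = {
    "VV010": "41000000",  # revenue (billing flow)
    "VV020": "41009000",  # sales deductions
    "VV140": "50000000",  # cost of goods sold
}

def target_gl_account(value_field: str) -> str:
    """Return the ABCOPA target G/L account for a CBCOPA value field."""
    try:
        return VALUE_FIELD_TO_GL[value_field]
    except KeyError:
        raise ValueError(f"no ABCOPA mapping for value field {value_field}")
```

In practice this table is the central deliverable of the mapping workshops with the business: every value field actively used in CBCOPA must end up with at least one target account.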
Define the Migration Settings
Separate document type for COPA migration
You can define a separate document type to segregate these COPA migration postings from others, for easy identification, analysis, and troubleshooting if needed.
GL Accounts/ Cost Elements for COPA Migration
Create migration G/L accounts for each value field (there can be an M:M relationship between value fields in CBCOPA and G/L accounts in ABCOPA):
A G/L account with a cost element, to post the nullifying impact to CBCOPA using value fields
A migration G/L account without a cost element, for the contra posting to ABCOPA
Line items from the CE1XXXX table need to be migrated by making postings as follows:
Cost Element GL DR/CR (Posting to CBCOPA)
Migration GL DR/CR (Posting to ABCOPA)
In such postings, the characteristics can be derived from the profitability segment definition in CE4XXXX matching the profitability segment of the CE1XXXX line item. A BDC-recording- or BAPI-based custom program should be developed and used to make the migration postings en masse in a controlled manner.
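The nullify-and-repost logic described above can be sketched as follows. This is a simplified illustration only; the field names are hypothetical stand-ins for the CE1XXXX line-item structure, and a real implementation would post via BDC or a BAPI:

```python
def build_migration_posting(line_item: dict, migration_gl: str,
                            doc_type: str = "ZC") -> list:
    """Build the two offsetting lines for one CE1XXXX record: a line on
    the original cost-element G/L that nullifies the CBCOPA value, and a
    contra line on the migration G/L that carries the value into ABCOPA.
    Both lines inherit the characteristics of the profitability segment."""
    amount = line_item["amount"]
    segment = line_item["prof_segment"]  # characteristics from CE4XXXX
    return [
        {"doc_type": doc_type, "gl_account": line_item["cost_element_gl"],
         "amount": -amount, "prof_segment": segment},   # nullify CBCOPA
        {"doc_type": doc_type, "gl_account": migration_gl,
         "amount": amount, "prof_segment": segment},    # contra to ABCOPA
    ]
```

Note that the two lines always net to zero, so the overall P&L is unchanged; only the COPA representation moves from value fields to G/L accounts.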
Due to the heavy volume of the migration, execution can be done company code by company code, to limit the volume and allow better control over the run. In many organizations we have seen millions of records in COPA, so identifying the relevant subset of the COPA tables is critical for effective migration and, subsequently, for reconciliation.
The FI period should also be open as of the migration transfer date, as the migration posts to financial accounting as well.
Perform reconciliation between CBCOPA and migrated ABCOPA
After all relevant line items have been migrated, a sanity check should be done by comparing various reports, such as:
S_PL0_86000030 (G/L Account Balances (New)) for balances matching
Any client specific report created in KE30
KE91 report for line items
Such report extracts should be taken before and after the migration run, for reconciliation and audit purposes.
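The before/after comparison itself is simple in principle. Here is a sketch of the balance reconciliation, assuming the report extracts have already been reduced to per-account balance dictionaries (an assumption for illustration; the real extracts come from the reports listed above):

```python
def reconcile(before: dict, after: dict, tolerance: float = 0.01) -> dict:
    """Compare per-account balances extracted before and after the
    migration run; return the accounts whose difference exceeds the
    tolerance, together with that difference."""
    mismatches = {}
    for account in sorted(set(before) | set(after)):
        diff = after.get(account, 0.0) - before.get(account, 0.0)
        if abs(diff) > tolerance:
            mismatches[account] = round(diff, 2)
    return mismatches
```

Any account returned here needs investigation before the migration can be signed off.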
As you might have guessed by now, migration from CBCOPA to ABCOPA is not straightforward: it needs business involvement to confirm mappings, plus substantial development for identifying flows, reposting, and reconciliation. Infosys Limited has developed a tool, ABCD (Account Based COPA Designer), which can automate these steps with little intervention from the business beyond mapping decisions, and Infosys provides this tool to its S/4HANA customers interested in CBCOPA-to-ABCOPA migration as part of an S/4HANA project.
I wish for a world without disputes, but the reality is that in every sphere of business disputes can arise. While running a business, customer disputes can impact an organization’s financial stability and credibility.
SAP FSCM Dispute Management helps identify and document disputes earlier in the payment cycle, track and monitor the reasons that drive DSO (Days Sales Outstanding), and streamline the dispute resolution process, all while remaining fully integrated with FI. The key process steps in FSCM Dispute Management are as follows:
FSCM Dispute Management is not new functionality; it existed in the ECC environment as well, so what is really changing in FSCM is of great interest for organizations converting to or adopting S/4HANA. Let’s look at the key changes in detail:
The core attributes of a dispute case are the same in S/4HANA as in ECC, but transaction processing has been simplified considerably in S/4HANA. Highlights of these transactional changes are below:
Process Receivables (Fiori ID F0106)
The Process Receivables Fiori application belongs to FSCM Collections Management but is fully integrated with FSCM Dispute Management. For example, in Figure 1 below, the Process Receivables app gives you the option to see the dispute cases for a customer and to create new dispute cases for outstanding items.
When creating a dispute case, the screen shown in Figure 2 below presents the specific fields relevant for the user:
The various disputes for the customer appear in one place, as shown in Figure 3 below:
Manage Dispute Cases (Fiori ID F0702)
This Fiori application helps in managing dispute cases, as shown in Figure 4 below. Here you can change the processor for multiple dispute cases in a single go, or open a particular dispute case to change its attributes:
On opening a dispute case, you can change other attributes such as root cause code, person responsible, and reason, as shown in Figure 5 below (the attributes are the same as in the ECC environment).
Other Fiori Applications of GUI type:
There are various other Fiori applications available, such as Write Off Dispute Cases (Fiori ID UDM_AUTOWRITEOFF), which are of the GUI type, i.e., the look and feel is similar to SAP GUI transaction codes, so processing for such transactions is not impacted in the S/4HANA environment. For example, Figure 6 shows a screen like that of GUI transaction UDM_AUTOWRITEOFF.
Analytics is a core capability of S/4HANA, and it provides many out-of-the-box analytical applications that show key KPIs on the tile page, as shown in Figure 7 below:
Let’s drill down into these analytical apps to see what kind of information they show:
Overdue Receivables in Dispute (Fiori ID F2540)
Figure 8 shows the overdue receivables in dispute, which directly represent working capital with a low probability of realization. The analysis can be done from various angles, such as by company code, by customer, or by processor:
Open Disputes (Fiori ID F1752)
Figure 9 shows the open dispute cases; the analysis can be done from various angles, such as by customer or by processor:
Solved Disputes (Fiori ID F2521)
Figure 10 shows the closed dispute cases; the analysis can be done from various angles, such as by processor or by dispute case:
Processing Days of Open Disputes (Fiori ID F2522)
Figure 11 shows the processing days of open disputes, which helps in prioritizing cases for follow-up. The analysis can be done from various angles, such as by coordinator, by dispute case, or by customer:
Role based menu in Fiori Launchpad
SAP S/4HANA provides the Fiori Launchpad, so all the Fiori applications a user needs to manage dispute cases can be placed in a single catalog for ease of processing, as illustrated in Figure 12 below:
Integration with external application(s)
You can define external applications that allow you to process dispute cases outside of SAP Dispute Management. In the out-of-the-box settings, the SAP application “CRM Claims Management” is defined as an external application.
Lightning-fast processing powered by the HANA platform
FSCM Dispute Management also benefits from the proven processing speed of the HANA platform.
Processing in the new transactional apps is faster because of the simplified accounting data structure in the backend, fewer data fields to fill, and fewer steps to perform dispute transactions.
In a nutshell, S/4HANA increases the utility of the dispute management function and strengthens the business case compared to using a third-party tool for this function.
With its latest product, SAP S/4HANA, SAP is revolutionizing how we approach finance by re-architecting data persistency and merging accounts and cost elements. This book offers a fundamental introduction to SAP S/4HANA Finance. Dive into the three pillars of innovation: SAP Accounting powered by SAP HANA, SAP Cash Management, and SAP BI Integrated Planning. Find out about the new configuration options, the updated data model, and what this means for reporting in the future. Get a first-hand look at the new user interfaces in SAP Fiori. Review the new universal journal, asset accounting, material ledger, and account-based profitability analysis functionality. Examine the steps required to migrate to SAP S/4HANA Finance and walk through the deployment options. By using practical examples, tips, and screenshots, this book helps readers to:
- Understand the basics of SAP S/4HANA Finance
- Explore the new architecture, configuration options, and SAP Fiori
- Examine SAP S/4HANA Finance migration steps
- Assess the impact on business processes
Backup Using Storage Snapshots
Storage snapshots are taken at storage level and are a backup or copy of all disks in a storage group at the same point in time.
Storage snapshots have the following benefits:
They can be created with minimal impact on the system. This is because storage snapshots are created in the storage system and do not consume database services.
Recovery from a storage snapshot is faster than recovery from a data backup.
An SAP HANA database can be recovered to a specified point in time using a storage snapshot alone or a storage snapshot in combination with log backups.
Storage snapshots and SAP HANA MDC
Backup and recovery using snapshots is not yet available for multitenant database containers (MDC). There are two types of snapshots: database-aware and database-unaware snapshots.
Database Unaware Snapshots
A database unaware snapshot is a storage snapshot taken without notifying the database. When a recovery to the snapshot is done and the database restarted, the database assumes that a power-failure occurred and performs an online recovery. Database unaware snapshots are, by definition, inconsistent because the snapshot was taken while the database was running.
Database Aware Snapshots
A database aware snapshot is a storage snapshot which, when taken, notifies the database. As the database is warned, it can save a consistent state to disk. Database aware snapshots are, therefore, preferred over database unaware snapshots.
From an SAP HANA perspective, the storage snapshot captures the content of the SAP HANA data area at a particular point in time. Only use database aware snapshots. A snapshot is created by first creating an internal database snapshot. The database snapshot provides a view of the database with a consistent state at the point in time when it was created. The database snapshot is used to ensure the consistent state of the storage snapshot, regardless of the physical layout of the data area with respect to the number of disks, controllers, etc.
The following steps are needed to create a Storage Snapshot (see Figure 4.3):
Use the SAP HANA Studio or the Command Line Tool to initiate the database snapshot.
Use storage tools to create a storage snapshot.
Use the SAP HANA Studio or the Command Line Tool to confirm that the storage snapshot was created successfully. An ID is written to the backup catalog and the snapshot is released.
Integration between SAP HANA and Storage
Database snapshots can be created via SAP HANA Studio or the SQL command line. Both options are easy to use. For daily backups, the creation of the database snapshot has to be automated by developing two scripts: one to initiate the database snapshot and another to release it. Both scripts are then called from within the storage solution. Fortunately, most storage vendors provide such scripts by default.
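To give an idea of what those two scripts do, here is a sketch that merely builds the SQL statements involved; the `BACKUP DATA ... SNAPSHOT` syntax follows the SAP HANA SQL reference, but verify it against your HANA revision before use:

```python
def create_snapshot_sql(comment: str) -> str:
    """SQL sent by the 'initiate' script: prepares the internal
    database snapshot that freezes a consistent state on disk."""
    return f"BACKUP DATA CREATE SNAPSHOT COMMENT '{comment}'"

def close_snapshot_sql(backup_id: int, external_id: str) -> str:
    """SQL sent by the 'release' script after the storage snapshot
    succeeded: confirms it and writes the ID to the backup catalog."""
    return (f"BACKUP DATA CLOSE SNAPSHOT BACKUP_ID {backup_id} "
            f"SUCCESSFUL '{external_id}'")
```

The storage solution's scheduler would run the first statement, trigger the storage-level snapshot, then run the second statement with the storage snapshot's ID.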
This excerpt is from the SAP HANA Implementation Guide, compliments of Espresso Tutorials!
A few months ago, we received a live chat on our website asking if we had a list of all SAP transaction codes – we didn’t and so I wrote a blog where you can download all SAP tcodes. This blog turned out to be the most visited article of the year – who would have known?
Then recently someone asked me about this list on our Twitter feed; specifically, whether our list also included the transaction codes for SAP’s newest release, S/4HANA (it didn’t). So I went into our S/4 system, downloaded all transaction codes to an Excel spreadsheet, and compared the two lists.
As a reminder, a transaction code in SAP is a shortcut to an activity. For example, AS01 is the code to create a new fixed asset master record, FB01 lets you post a financial document, and ME21 creates a purchase order, etc.
You can see a complete list of all transaction codes by displaying the contents of table TSTCT – this is where SAP stores all tcodes and their description in all installed languages.
It turns out that there are about 7600 new transactions in S/4 (and that’s on top of the roughly 100,000 tcodes that exist in SAP’s ECC 6 release) …that’s somewhat surprising to me – I would have guessed a lot higher.
Anyway, download all new SAP S/4 transaction codes and take a look for yourself.
Once downloaded, play around with it – for example, you can search for the string /ui2/ (=user interface 2) to find all new Fiori-related transaction codes quickly.
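The filtering itself is trivial once the list is in a structured form. A minimal sketch (the sample rows are illustrative stand-ins for an export of table TSTCT):

```python
def filter_tcodes(rows, needle="/UI2/"):
    """Keep only the (tcode, description) rows whose transaction code
    contains the given string, case-insensitively."""
    return [(code, text) for code, text in rows
            if needle.lower() in code.lower()]

# Illustrative sample rows in the shape of a TSTCT export.
sample = [
    ("AS01", "Create Asset Master Record"),
    ("/UI2/FLP", "SAP Fiori Launchpad"),
    ("FB01", "Post Document"),
]
print(filter_tcodes(sample))  # only the /UI2/ entry remains
```

The same approach works for any other prefix you want to explore in the downloaded list.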
I hope you’ll find the list useful and, as always, please comment below and share this post.
This document is designed specifically for the system refresh of an SCM system. It will help an SAP Basis consultant gain the basic knowledge of an SCM system refresh. If we want to replace a system’s legacy data with production data in quality or pre-production, we perform a system refresh, for example copying production data to the quality or pre-production environment. Production remains online during this activity.
SCM System Refresh:
In this process we copy the production system to a lower environment, such as pre-production or sandbox, for the purposes below:
Before any cutover/go-live, for testing.
Before any upgrade of an enhancement package, support pack stack, or version.
Companies want their SAP test system to look like their production system for better testing.
Step 1 Screenshots of some transactions to compare after the refresh
STMS: Transport Domain, System Overview, Transport Configuration, Transport Parameter
SCC4: Client status open / close.
AL11: Transport directory location.
RZ03: Operation mode, current work process status, instance and startup profile location from instance status
WE20: Partner profile.
SLICENSE: for later use of the hardware key
SLDAPICUST: note which system
STRUSTSSO2: see which tickets are active
Step 2 Screenshots of some tables using SE16
USR02 Number of entries (to compare after import)
Step 3 SM37: download all BTC info into .XLS
Step 4 SPAD: Printer Export as .txt format
Step 5 Export Background Jobs:
Export the tables using exp, then put them in a dump file
Step 6 RFC export:
Step 7 Export AL11 table USER_DIR
Step 8 SCC8 User Master Data Export
The user master data of the client is exported so that the production users are not carried over to the target system.
Up to 3 requests are created, depending on the data selected and available:
1. "XXXKO00012" for transporting cross-client data, if you have selected this
2. "XXXKT00012" for transporting client-specific data
3. "XXXKX00012" for transporting client-specific texts, provided texts are available in this client
2. Choose profile SAP_USER and Target system as same client.
3. Schedule job immediately with no printer Dialog
5. SCC3 go to Export
6. Wait till finished.
Removing the data files of XXX and copying the PRD data (user ora<sid>)
rm -r sapdata1
rm -r sapdata2
rm -r sapdata3
rm -r sapdata4
rm -r origlogA
rm -r origlogB
rm -r mirrlogA
rm -r mirrlogB
cd /oracle/XXX/ (we are here only so no need to do it (user orasid))
scp -pr /tmpmount/sapdata1 . &
scp -pr /tmpmount/sapdata2 . &
scp -pr /tmpmount/sapdata3 . &
scp -pr /tmpmount/sapdata4 . &
scp -pr /tmpmount/origlogA . &
scp -pr /tmpmount/origlogB . &
scp -pr /tmpmount/mirrlogA . &
scp -pr /tmpmount/mirrlogB . &
Step 1 Create the control file on the source system
su – ora<Source SID>
sqlplus /nolog
connect / as sysdba
alter database backup controlfile to trace;
cp <latest-file>.trc SIDcontrol.sql
scp SIDcontrol.sql to the target system
Step 2 Edit the control file on the target system
chown ora<sid>:dba SIDcontrol.sql
Step 3 Log in to the database and recover it from the control file.
oraxxx> sqlplus / as sysdba
Recover database using backup controlfile until cancel;
alter database open resetlogs;
alter tablespace PSAPTEMP add tempfile '/oracle/XXX/sapdata1/temp_1/temp.data1' size 540M reuse autoextend on next 20000000 maxsize 10000M;
alter database noarchivelog;
alter database open;
alter database rename global_name to XXX;
Step 4 We need to drop the OPS$ users of the PRD system and recreate them with the current SID.
1. select username from dba_users; (this lists all OPS$ users; recreate them with the XXX SID and grant the related roles)
2. drop user OPS$ORAXXP cascade;
3. drop user OPS$XXPADM cascade;
4. drop user OPS$SAPSERVICEXXP cascade;
5. create user OPS$ORAXXX identified externally default tablespace SYSTEM temporary tablespace PSAPTEMP;
6. create user OPS$XXXADM identified externally default tablespace SYSTEM temporary tablespace PSAPTEMP;
7. create user OPS$SAPSERVICEXXX identified externally default tablespace SYSTEM temporary tablespace PSAPTEMP;
8. grant DBA, CONNECT, RESOURCE to OPS$XXXADM;
9. grant EXP_FULL_DATABASE to OPS$ORAXXX;
10. create synonym "OPS$SAPSERVICEXXX".SAPUSER for "OPS$XXXADM".SAPUSER;
11. create table "OPS$XXXADM".SAPUSER (USERID VARCHAR2(256), PASSWD VARCHAR2(256));
12. insert into "OPS$XXXADM".SAPUSER values ('SAPSR3', '*******');
13. @/sapmnt/XXX/exe/sapconn_role.sql (this disconnects SQL*Plus)
14. sqlplus / as sysdba
15. @/sapmnt/XXX/exe/sapdba_role.sql SR3 (this disconnects SQL*Plus)
16. su - xxxadm
17. R3trans -d
Step 5 Drop the PRD tables and reimport the exported tables of the old system.
1.sqlplus / as sysdba;
3.Drop table sapsr3.USER_DIR;
Step 6 After importing the old table data, log in to SAP and do the post steps in SE06:
Database copy or database migration
Do you want to reinstall CTS --> YES
Source system of Database copy--> XXP
Change original from XXP to XXX
Transport system not configured --> OK
Do you want to change original from XXP to XXX --> YES
Delete TMS Configuration--> NO
Delete the original version of transport routes--> NO
Step 7 STMS: Restore as old system
Description: XXX Sandbox
Step 8 BDLS: New logical system Name: xxxclnt500
Conversion of client-dependent and client-independent tables
Table to be converted: **
Continue with conversion anyway --> YES
Step 9 Import user master
Step 10 Delete cancelled and Finished jobs
Go to SE38 and run RSBTCDEL. Execute in background -> create a variant named “REFRESH” with the following properties:
Jobs from all users (*), Days: 01, fill the three boxes at the bottom with ‘X’.
Be sure to check ‘delete with forced mode’.
Save the variant, go a step back and click “Execute Immediately.”
The refresh process provides the latest data in the quality or pre-production environment, which helps in catching errors before they reach production. I recommend performing this activity every quarter or before any major changes and go-lives.
If you use Material Ledger’s Actual Costing, then you would know that the Post Closing Step creates accounting documents depending on how the variances for the Material have been distributed. For example, a material with a price (or exchange rate) difference of $100 could be sold, scrapped, used in a production order that is complete, used in a production order that is not complete, transferred to another plant, or left in inventory. And this only refers to the differences that are created on the material itself (single-level), and not the differences that are transferred from other materials (multilevel) which have their own slew of Material Ledger postings.
Because of this, it is easy to be overwhelmed by the volume of postings that are created by the Material Ledger’s closing entry and what they mean. Some companies choose to label the General ledger accounts appropriately to indicate what the posting is for, but if you do not understand the posting, it is easy to incorrectly label the General Ledger account. Also, you may not need a separate general ledger account for each scenario as that may lead to more General Ledger accounts than you need, and may create even more confusion.
Believe it or not, SAP does try to provide guidance on what each Material Ledger posting is for. It does so by inserting texts into each line item posting to indicate what it is (as described in SAP Note 2397606). The problem is that these text explanations are sometimes cryptic and do not provide much clarity on what the posting is for. Also, there is very little information (online or elsewhere) that gives a more meaningful explanation of these text descriptions.
In the table below, I will try to explain (as best as I can) what texts are shown in the FI line items created by Material Ledger and what they mean.
Many users wonder how SAP calculates the Net Present Value or Mark-to-market figures in transaction JBRX. This post seeks to help users gain an understanding and interpret the results that are generated when one executes the transaction.
An investment transaction generates a series of future cash-flows and the Net Present Value (NPV) is the present value of future cash-flows.
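In code, that definition is just a sum of discounted cash-flows. A minimal sketch (the flows and discount factors below are arbitrary sample values, not output of JBRX):

```python
def npv(discounted_flows):
    """Net present value: each future cash-flow multiplied by its
    discount factor, summed over all flows."""
    return sum(cf * df for cf, df in discounted_flows)

# Two interest payments and a final repayment, each paired with an
# illustrative (made-up) discount factor.
example = [(4602.74, 0.999), (4602.74, 0.998), (504602.74, 0.988)]
value = npv(example)
```

JBRX performs exactly this summation, deriving the discount factors from the yield curve assigned in the evaluation type.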
Example: invest €500,000 in a fixed-term deposit for one year at an interest rate of 12% p.a. with monthly interest periods.
Cash-flows are calculated as below:
To calculate the NPV use transaction code JBRX.
On the selection screen, enter the following information:
Evaluation Type- This carries the relevant evaluation parameters that will be used in the calculation of the NPV.
Evaluation Date- This is the date that determines the market data that will be used in the NPV calculation.
Horizon- This is a future date on which future cash-flows are discounted. The horizon date must be greater than the evaluation date.
You can also choose to run the NPV analysis using simulated transactions, and you can limit the analysis based on different characteristics, for example the transaction number.
Once you are happy with your selection criteria, you click Execute.
The system now displays the transaction which we entered in the selection characteristics and both the Nominal amount and the NPV amount are displayed as below:
Highlight the transaction and click on Detail log to view the yield curves that have been used in the NPV calculation. The Yield Curve Framework is used to maintain reference interest rates and enter their values. On the basis of the reference interest rates, you can create yield curves to help you determine mark-to-market net present values with the price calculator.
This transaction will be discounted based on the reference interest rates maintained on yield curve 1002.
For detailed calculation parameters, click on the transaction again and select the icon Calculation Bases.
The Calculation Bases screen first shows the fixed cash-flows based on the fixed interest rate of 12% maintained on the transaction. The cash-flows are calculated based on the interest calculation method specified on the transaction, Act/365 in this case. For the first cash-flow, the calculation would be: €500,000.00 × 12% × (28/365) = €4,602.74.
For the NPV calculation, the system takes the discounting factor calculated from the 1002 yield curve values and applies it to the fixed cash-flow. For the first line, the calculation would be:
These discounted cash-flows are then added up and the total NPV of the transaction is €549,570.36.
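The two steps above (fixed cash-flow under Act/365, then discounting and summing) can be sketched in Python. In SAP the discount factors come from the yield curve (1002 in this example), so the helper below simply takes them as inputs rather than deriving them:

```python
# Sketch of the JBRX-style NPV calculation. The discount factors are
# assumed inputs; in SAP they are derived from the maintained yield curve.

def fixed_cash_flow(notional, annual_rate, days, basis=365):
    """Interest cash-flow for one period under the Act/365 convention."""
    return notional * annual_rate * days / basis

def npv(cash_flows, discount_factors):
    """Net present value: sum of each cash-flow times its discount factor."""
    return sum(cf * df for cf, df in zip(cash_flows, discount_factors))

# First interest period of the example: EUR 500,000 at 12% for 28 days
first_cf = fixed_cash_flow(500_000, 0.12, 28)
print(round(first_cf, 2))  # 4602.74, matching the Act/365 calculation above
```

The same `npv` helper, fed all twelve monthly cash-flows and their curve-based discount factors, reproduces the total shown by the Calculation Bases screen.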
Get the background on New Asset Accounting and S/4HANA, and run through some of the key changes that distinguish it from Classic Asset Accounting. Next, we will go into more detail on some of the larger areas, for example how the depreciation areas work with the ledgers to record the different accounting principles. You may already have heard of the Universal Journal, one of the biggest innovations in Finance in S/4HANA. We will explain what it is, how New Asset Accounting integrates with it, and how the new asset transactions work. I will also run through depreciation and, finally, briefly touch on the migration of assets.
Learn about the use of bills of material in a PLM context: material BOM types, the elements at header and item level, and how they are used.
The need for, and ways to, optimize your SAP system:
- How customers have suffered from a lack of proper utilization of SAP.
- Examples of how a deviation from Best Practices could adversely impact an organization.
- Methodology that should be used when performing an optimization project.
- How a proper system optimization will help companies with migrating to S/4HANA.
This article covers the basic functionality of Project System and how it integrates with MRP and other logistics functions.
Project System is a highly integrated module whose main purpose is cost consolidation and reporting.
As its name suggests, it is project oriented: something with a timeline and an end date that you want to track on a breakdown basis. There are also cases where a project simply acts as a receiver for a departmental cost, for example R&D for 2016.
There are six main setups for PS:
Costing – Projects that exist only for the purpose of planning costs at WBS level.
Assets – Projects that receive capital funds from investment programs.
Sales – Projects that are customer focused.
Manufacturing – Projects that are material/logistics focused.
Statistical – Projects that do not plan or receive costs.
Maintenance – Projects that exist for managing equipment.
The different setups cater to the different industries and usages of the module. Each has a different configuration and integration with other modules. In terms of functionality, PS is used to:
Receive and consolidate costs and revenue in a WBS (leveled and labeled) manner. A WBS element is a CO object, and thus can be a receiver for settlement.
Act as a planning tool for the project schedule
Act as demand management
Act as investment and asset management
Support logistics execution in terms of procurement and scheduling
For point 2, companies will normally use MS Project to do the scheduling and interface it with SAP PS. There is also a standard interface (Open PS) that can easily integrate the two applications.
To illustrate the core functionality of cost and revenue allocation, consider the example above, where you want to categorize the cost of building a car into three categories. By building this structure you can easily separate, and see the breakdown of, the cost of the car.
You can then attach a network to the engine WBS element. With this network we can create activities, which in turn enable the creation of activity elements where the logistics activities are maintained, as you can see in the picture above.
Work – Defines the internal work done by your own labor. This uses a work center and activity type to determine the cost. The rate can be maintained in KP26 (Activity Type/Price Planning).
External – Defines work done by contractors outside the company. This requires a vendor, an outline agreement (optional), and a cost element. The price is maintained manually.
Costs – Defines auxiliary costs. This will use a cost element and cost is entered directly.
Service – Defines work/ services using a service master.
Material Component – Specifies that the activity requires a material, and a demand will be entered. Based on the procurement type in MRP view 2, a purchase requisition or planned order will be created.
By putting the Material component into the network activity, it will act as a demand for the material.
There are five default standard options to choose from:
PEV enters the demand as a PIR and becomes stock for the WBS element.
PF triggers a purchase requisition for a non-stock item, meaning it will be consumed upon goods receipt.
PFS is the same as PF, except the item is delivered directly to the customer or another address. You can choose the customer address, the vendor address, or any location from central address management.
PFV enters the demand as a PIR, but as non-stock.
WE also enters the demand as a PIR, as a stock item at plant level.
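As a quick reference, the five indicators can be summarized in a small lookup table; the attribute names below are my own shorthand for this sketch, not SAP field names:

```python
# Illustrative summary of the five default procurement indicators for
# material components in a PS network (keys and attributes are shorthand,
# not actual SAP configuration fields).
PROCUREMENT_INDICATORS = {
    "PEV": {"demand": "PIR", "stock": "WBS stock"},
    "PF":  {"demand": "purchase requisition", "stock": "non-stock"},
    "PFS": {"demand": "purchase requisition",
            "stock": "non-stock, delivered to a third-party address"},
    "PFV": {"demand": "PIR", "stock": "non-stock"},
    "WE":  {"demand": "PIR", "stock": "plant stock"},
}

def is_stock_item(indicator):
    """True when the component is kept in stock (WBS or plant level)."""
    return "non-stock" not in PROCUREMENT_INDICATORS[indicator]["stock"]
```

For example, `is_stock_item("PEV")` is true (WBS stock), while `is_stock_item("PF")` is false, matching the consume-on-goods-receipt behavior described above.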
You can see the config in SPRO-Project System-Material-Procurement-Define Procurement Indicators for Material Components.
Entering a demand of 2 KG with requirement date 10.08.2017 transfers it as a PIR, which we can view in MD04 as below.
Run MRP and the system will generate procurement proposals.
For investment and asset management, it is mostly used in the oil and gas industry, where the project is an AuC (asset under construction) which, when completed, becomes a fixed asset and starts to depreciate.
In conclusion, Project System's core function is revenue/cost reporting, but it can also be integrated with logistics functionalities, driving the project in all its aspects. It is a highly useful module for a company that operates on a project basis. Below is an example of a customer project setup (which uses the Cost, Sales, and Manufacturing setups).
The project is created with WBS elements and a network. Cost planning, logistics, and revenue are set up, the baseline budget is set, and the project is executed. As the project runs, confirmations and goods receipts start, and this repeats until the project ends. The customer can then be billed from the sales order using a milestone/billing plan, and settlement is done at the end for closing. If the customer requests additional items, the whole process is retriggered with a different sales order and WBS element (which can be in the same project). After project delivery, the warranty period starts and any costs incurred are collected under warranty. When the warranty ends, settlement is done as part of the final project closing.
By Lori Moriarty at Michael Management Corporation
If you want to stay relevant as a technically savvy professional, you need continuing SAP training.
Because SAP is the 500-pound gorilla of ERP software. According to SAP, they have over 350,000 customers in 180 countries and 15,000 partner companies worldwide, and 87% of the Forbes Global 2000 are SAP customers. There you go: odds are high that you are going to be working for a company that has SAP.
Are you currently an SAP end user?
Great. Now it is time to be trained. Many companies use peer-to-peer training methods to get new end users up and running. Though that may be enough to get the job done, there is so much more to learn. Imagine being able to use menu shortcuts that you did not know existed. With proper training, you can become your department's Super User and position yourself for a promotion or raise based on your new level of proficiency. There may be a better (i.e., faster) way to do your reports and queries, and you may never know these processes exist until you take an SAP training course.
Looking for a new position or a new job?
Your supervisors and future hiring managers will look at your SAP qualifications. Yes, of course, it is great to have experience as an SAP end user. It is even better to show you went above and beyond and got professional skills training. You can take training for your specific career field: if you are in human resources, earn a certificate as an HR Administrator, HR Manager, or HR Payroll Manager. This kind of training is what will put your resume above the rest.
Want an impressive resume on LinkedIn?
Then just upload that hard-earned SAP Training certificate and let them know exactly how serious you are about your career. When you become a Certified SAP Professional, your credentials can be verified on-line by a third party. Remember to continuously update your LinkedIn profile as you change positions and take on more job duties. Reach out to others in your field and build your network because you never know who may be hiring. When you add your new SAP Certification be sure that your “share profile changes” button is in the ON position so that everyone can see your updates.
How would you like to make more money?
According to PayScale.com, SAP-certified consultants earn more than consultants without certification: 137 consultants with SAP training reported earnings of $75K to $126K, while 122 SAP consultants without SAP training or certification reported $49K to $78K. Does this mean that the consultants without training are bad? Of course not; they may be even better at the job than some of the higher-paid consultants. The point here is that perception is key: more training looks better.
What is motivating you to continue your SAP training?
One topic that is important in SD, but is often overlooked is the use of units of measure in custom ABAP programming. In SAP, units of measure are stored in a series of tables that begin with T006:
The table which controls various language translations of a unit of measure is table T006A. I’ll explain more on that shortly.
SAP does a lot of work behind the scenes that most developers rarely ever see. When calling up any item in SAP, a delivery for example, SAP will perform conversions of their data before displaying it to the user. This is done for several fields, but one of those fields is the unit of measure of the delivery item. SAP will convert the unit into what’s known as the external, or commercial unit of measure. This is the unit of measure that is displayed to the user. On the flipside, when a user enters a unit of measure into the delivery item and saves that data, SAP assumes that the user is entering the unit as the external or commercial unit of measure and will convert that entry into the internal unit of measure, or the unit of measure that is stored in SAP.
The difference between the internal and external units of measure is sometimes very hard to distinguish, as they often look the same in English. What it boils down to is that the internal unit of measure is language independent, meaning that it is the same in ALL languages. The external or commercial unit of measure varies depending on the language it is being viewed from. A developer may never notice this when dealing with units such as KG or M2, because these are typically stored the same way in all languages.
In the example above, the first unit is the internal unit, while the second unit (MSEH3) is the commercial unit. Both the internal and commercial units are the same in all languages. The problem begins to occur with units that vary in different languages.
One example is a crate. In German, a crate is “Kiste”, so the internal unit of measure is “KI”. Since SAP is a German system, the commercial unit is also “KI”. However, in English, we would not identify it as a “KI”, but rather something like “CRT”. We now have a difference. Also, in this example, the Japanese unit is in kanji:
One mistake a developer may make is to create a custom field using the external, or commercial, unit of measure. Another is to define a unit of measure variable as a CHAR3 with the intent of writing this value to a text file. This causes the field to take the form of whatever is found in the table. If a user has pulled this data straight from a delivery and writes it directly to the CHAR3 variable, SAP has not converted that text yet, and because the data has been assigned to a variable without a conversion exit, it never will be converted.
For illustration purposes, I built a custom program and table. The program takes an input and writes the data to the screen. Here are a few examples.
Note that I’ve entered the commercial unit of measure here.
What just happened? The unit of measure was converted when it was assigned to the field in the table to be added to the database; however, when it was written to a character variable, it was not converted back before output.
This example writes the unit of measure from the internal unit of measure field
Again, what happens here is that SAP converts it to the internal unit so that it can store the correct value in the table, however, before it is written to the screen, it is converted to the commercial unit of measure automatically.
Simply put, the reason this happened is that the commercial unit of measure (MSEH3) does not have a conversion exit. Therefore, when the converted unit of measure is assigned to this field, it is still in its internal format and does not get converted to its external format. The issue is that SAP will not automatically perform this conversion when assigning values to variables without the proper conversion exit.
Let’s take another issue that can occur with improper conversions. This can be seen when a developer writes a custom program to mass load data into a table. As an example, I made modifications to my existing program to also store records.
As you can see, when you look at the records in SE16N, the new record from the above example appears.
However, when attempting to find the record, no match is returned:
This is caused by storing the improper unit of measure. The commercial unit was not converted before it was added to the database. So when the user enters the unit of measure in the SE16N search criteria, SAP automatically converts it into the internal unit of measure; since the commercial unit is the one that got stored, SAP cannot find a match.
When writing custom programs, a developer must take caution when dealing with units of measure. In many cases, it will be the responsibility of the developer to perform the conversion of the units manually, especially when failing to use the correct variable with the conversion exit built in. Fortunately, SAP provides the tools needed to accomplish this. In SE11, the data element for the internal unit of measure (MSEHI) has the domain MEINS.
Double-clicking the domain name and clicking on the Definition tab will show the conversion routine of the data element.
In this example, it is “CUNIT”. Double-clicking the conversion routine name will show the INPUT and OUTPUT conversion routines for this field.
The typical naming convention of a conversion routine is “CONVERSION_EXIT_” followed by the conversion exit name, followed by a final piece to explain the purpose of the function.
For example, CONVERSION_EXIT_CUNIT_OUTPUT converts an internal unit of measure to the commercial unit for user display, while CONVERSION_EXIT_CUNIT_INPUT converts a commercial unit entered by the user to the SAP internal unit of measure for storage. As a rule of thumb, OUTPUT is used when data is written to a screen, document, etc., while INPUT is used when that data serves as input for interacting with SAP table data.
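To make the INPUT/OUTPUT pairing concrete, here is a minimal Python sketch of the language-dependent lookup that the CUNIT exits perform against T006A. The tiny table below is an illustrative stand-in, not T006A's real contents:

```python
# Mimics CONVERSION_EXIT_CUNIT_OUTPUT / _INPUT with an illustrative
# subset of T006A: language key -> {internal unit: commercial unit}.
T006A = {
    "E": {"KI": "CRT", "KG": "KG"},   # English: a crate displays as CRT
    "D": {"KI": "KI",  "KG": "KG"},   # German: internal and commercial match
}

def cunit_output(internal_unit, langu="E"):
    """Internal -> commercial unit, for display (like _OUTPUT)."""
    return T006A[langu].get(internal_unit, internal_unit)

def cunit_input(commercial_unit, langu="E"):
    """Commercial -> internal unit, for storage (like _INPUT)."""
    for internal, commercial in T006A[langu].items():
        if commercial == commercial_unit:
            return internal
    raise ValueError(f"unit {commercial_unit!r} unknown in language {langu!r}")
```

So an English user types CRT, `cunit_input` stores KI, and `cunit_output` turns KI back into CRT on display, while a German user sees KI both ways. Skipping either half of the round trip produces exactly the mismatches shown in the examples above.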
When using the conversion exit for units of measure, one key piece of information is the language. When using the conversion exits, it is best to avoid constants for the language and to use the system variable SY-LANGU. This will always provide the logon language when converting units of measure.
Going back to our example, I’ve added the needed conversion routines to my code. Now, let’s run the report again:
As you can see, both the internal and external units of measure now show correctly, and SE16N is able to successfully find the record.
It’s a little confusing at first to distinguish between the internal and external units of measure, but after some time, this will become very clear and will allow for any custom development to handle unit of measure information coming and going.