
3 Simple Strategies for building Adaptable Decisions and Rules in #NetWeaver DSM & #BRFplus


Life belongs to the living, and he who lives must be prepared for changes.

Johann Wolfgang von Goethe

ID-100233553.jpg

Image courtesy of imagerymajestic / FreeDigitalPhotos.net


This week I had the joy of running a SAP NetWeaver Decision Service Management workshop - titled Cutting the IT Backlog via Do-It-Yourself Business Rules and Decision Services - at the Mastering SAP Technologies 2014 conference in Melbourne.  Not only is it great to get together with like-minded folk who really understand the potential of business rules, I find preparing for annual conferences is a great opportunity to reflect on lessons learned over the past year.  So this blog captures a few of those lessons learned on how to build adaptations directly into your business rules.

 

Now these lessons have come from some projects I have been working on where we are dealing with implementing government legislation using decision services and business rules.  After nearly a year of dealing with legislation I have come to the following conclusions:

  • Legislation includes the most complex and complicated decisions and business rules any organization is likely to face
  • Legislation is a prime example of business rules that adapt over time, as new legislation and legislative amendments are introduced throughout the year at irregular intervals in response to changing markets, high-profile news events, political initiatives, and lobbying by strategic interest groups
  • Legislation is a prime example of decisions that may need to be applied retrospectively, e.g. when a citizen raises a legal appeal, it is often necessary to re-execute the decision under the original legislation with adjusted inputs based on the outcome of the appeal, and such appeals often take weeks, months or sometimes even years before the outcome is decided
  • Most business and IT people are not very good at including adaptation strategies in their rules design – the focus is always on what happens now

 


So here a few simple strategies that we have used time and time again to cope with even quite complex adaptations.  All of these can easily be built into decision services and business rules from the beginning of your design.


Minor adaptations: Add an Effective Date column to your Decision Table

When your focus is on what happens now, it’s easy to think of many thresholds, limits, rates and factors as unchanging.  For example I can’t count the number of times I have seen workflows written with static deadlines, e.g. Escalation Date = Request Submission Date + 2 days.


More often than not, someone later wants to increase the deadline temporarily to cope with national public holidays, or the summer office close-down period.


It’s not an uncommon problem to forget that things can change, or that exceptions can occur which vary the usual rate.  In legislation, some of the most commonly assumed never-changing thresholds are those built around a person’s age, e.g. a child is someone under 18 years of age; an old-age pensioner is someone over 65.


We tend to forget that these limits are actually quite arbitrary, and are a matter of current policy and legislation which can change.

 

Last November the Australian Productivity Commission proposed raising the eligibility age for the old-age pension from 65 years to 70 years, and since then there have been a number of media soundbites from Australian federal government politicians suggesting that this proposal is being taken very seriously.

 

Now the age at which a person is eligible for an Australian old-age pension has been 65 years old for over 100 years, so it’s completely understandable that both business and IT would think of the pension age as unchanging, but we simply can’t guarantee that will be the case in the future.  To make things worse, media reports have suggested that the increase in pension age will be gradual.  The previous federal government has already introduced gradual changes to increase the retirement age by 6 months every 2 years from 2017, and it’s likely the pension age might be increased in a similar way.

 

If we used a constant or entered a direct value of 65 for the pension age in our business rules we would have a lot of rework to adapt our decision services and business rules if and when such changes are introduced.  But if we instead store the value in a decision table and simply add an effective date column, we can easily cope with these sorts of changes, without having to worry about when or if they will be introduced.  


By using a decision table with a date column, all anyone will have to do to adapt the pension age, if such changes are legislated, is add a new row to the decision table; and if the changes don’t happen we simply have one row with effective date <= 31.12.9999 and the rules will still work fine. There’s also very little impact on our design to add this from the beginning.  We just add a simple expression to read the pension age from the table and store it in a Ruleset variable for use in any subsequent rules.


Such a decision table might look like this:

decisiontable age pensions.JPG
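
In case the screenshot doesn’t come through, here is a minimal illustration of what such a table might contain (the multi-row version is purely hypothetical and only shows how a legislated change would be captured):

Today, one row is enough:

  Effective Date <= 31.12.9999   →   Pension Age 65

If a gradual increase were ever legislated, the table is simply extended, for example to:

  Effective Date <= 30.06.2017   →   Pension Age 65
  Effective Date <= 30.06.2019   →   Pension Age 65 years 6 months
  Effective Date <= 31.12.9999   →   Pension Age 66

The rows are evaluated top to bottom, so the first row whose effective date condition matches the evaluation date supplies the pension age.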

 

Of course this approach works equally well for those limits, thresholds, rates and factors that we do expect to change – like tax rates, inflation factors, superannuation caps, etc.


Major adaptations: Ruleset Preconditions

The thing about major adaptations is that they are nearly always introduced on a specific date.  With government legislation, new acts and legislative amendments are always introduced on a publicly announced date.  For us the Australian Commonwealth law and legislation website http://www.comlaw.gov.au/ is the fount of all knowledge for such dates. In the private sector, major changes are also often introduced as at a specific date – the introduction date of a new initiative or project, the start of a new fiscal year or new calendar year.


There’s another common consideration for major adaptations, i.e. while the underlying business rules of the decision service may need to be radically different from the previous version, often there is a lot of opportunity to reuse existing business rules.


Now you could just change your whole Ruleset and use the BRFplus versioning to decide which version of your decision service to run.  That’s fine so long as the old rules were perfect in every way. Not so great if as a result of a legal appeal, complaint, or objection, an adjustment is needed to the old rules – because of course you can’t fix the old version and redeploy it again under the same deployment date once those rules have been superseded by a new version.


So we found a smarter, more flexible, and more visible way to ensure the correct rules are considered, and it’s made executing rules retrospectively much easier as well: Rulesets with preconditions. It works like this:


Each major adaptation, in our case each major legislation act or amendment, has its own ruleset containing the rules applicable to that adaptation and the sequence they are to be executed.  So long as all the rulesets are contained within the same Rules Application it’s easy to reuse rules, and their underlying expressions and actions, in both rulesets.


Usually somewhere in the inputs of your decision service (aka BRFplus function) you will have passed a date that can be used to decide which version of the business rules should be used, e.g. the submission date of an application or request.


Before running each ruleset, that date can be compared to the dates for which the particular adaptation applies.  This makes running the decision service for future and retrospective dates very easy – just pass the appropriate date as one of the decision service inputs, and - Hey Presto! - the correct rules are applied. 


We can even mix rulesets with and without preconditions.  So if there is a truly common ruleset that runs pre or post the others, all that needs to be done is to set the ruleset execution priority and each ruleset will be applied in the correct sequence.


Tip: One restriction of preconditions is that whatever needs to be checked must be part of the decision service context, i.e. part of the inputs and outputs of the BRFplus function itself, and the precondition itself must be expressed as a Value Range expression.
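
To make the idea concrete, a purely illustrative setup (the ruleset names and dates below are invented) might look like this:

  Ruleset: Original Act Rules        Precondition: Submission Date <  01.07.2014
  Ruleset: Amendment Act Rules       Precondition: Submission Date >= 01.07.2014
  Ruleset: Common Validation Rules   No precondition (higher execution priority, so it always runs first)

At runtime only the rulesets whose precondition matches the submission date passed into the decision service are executed.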


ruleset preconditions.JPG


Back to the Future: Constants for Milestone Dates

Now this is one strategy that was hard won.  While our design had catered for most things that could change, there are always those true constants that never do.   Like the introduction date of past legislation or a past initiative. It’s never going to change.  So at first we were entering some of these as direct values in conditions, cells of decision tables, values in formulas, preconditions in rulesets, etc.


We only realised we were creating a problem for ourselves when we, admittedly without thinking it through, started using the same approach for an upcoming milestone date.  It was a legislation date, it was already announced, and so it wasn’t going to change, right? Wrong! Politics being what it is, certain political lobbies were barracking for more time to introduce the changes, the politicians agreed and all of a sudden our set-in-stone legislation date was crumbling.  Worse, the new date was in flux – we were told it would either be original legislation date + 1 month, or original legislation date + 2 months.  Not happy!


Back to the design drawing board and we realised we needed to go through and change all those direct values to use a single constant expression.  All well and good, but then something interesting happened…


We were discussing the testing phase in depth with the various business groups and the data migration team, and we started to realise that by moving from direct values to a constant we had accidentally discovered an easy way to handle future and retrospective testing; and improve the meaning and visibility of our rules at the same time.


Putting a milestone date in a constant meant we could give a meaningful name to the constant, e.g. Live Longer Act.  Because we reused that constant across a wide range of expressions – e.g. in conditions, decision tables, formulas – all of a sudden all of these expressions became that much more business meaningful. For instance, instead of a row in a decision table indicating a rate applied from 01.10.2014, it now read as the rate applying to the Live Longer Act.  Not only was this more meaningful, it also greatly improved the accuracy of assigning the correct date to the correct expression.
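
As a simple illustration using the example above (the rate rule itself is made up), a decision table condition changes from this:

  Valid From >= 01.10.2014         →  apply the new rate

to this:

  Valid From >= Live Longer Act    →  apply the new rate

where Live Longer Act is a constant expression that currently holds the value 01.10.2014.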


notable constants.JPG


When we came to testing, we had a lot of data being migrated that we could use as test data, but of course it was all for past dates. So how could we test our future rules without creating or copying a lot of data to future dates? Easy! All we had to do was adjust the date in the constant, and everywhere that constant was referenced was automatically adjusted as well.  So all those conditions, decision tables, preconditions, formulas, etc. were now pointing to 01.03.2013 or whatever date we chose, and we could then run before and after testing on past data just using the constant to control whether we were running future or retrospective rules.  Magic!


P.S. Don’t forget to set your constant back to the real date before you deploy your decision services to production!


Anyone else have a strategy to share?

These 3 simple strategies have made all the difference for us, but I am sure there are many other approaches that could be used.  We are so privileged to be part of the SAP Community Network, and I know the Business Rules Management community is very active in helping others.  So I would be delighted to hear in the comments how others have tackled building rules for adaptation, or whether you have used similar approaches successfully at your site.


DSAG Topic Day BRFplus/DSM


It was no April Fools' joke: on April 1 the BRFplus/DSM topic day of the German SAP user group DSAG was held in St. Leon-Rot. The idea for this day started on SCN. Last December I realized that I would like to get in contact with other experts in the area of business rules and blogged about this idea in Decision Management in the Enterprise – It's Time for a Conference. The blog was read, and soon we had a phone conference and asked ourselves questions about the scope: are there enough people who want to join? Perhaps it would be better to organize a joint conference about DSM and BRM/PO? My position was clear: I was convinced that many people would join, and there are so many interesting stories to tell that it would be better to keep the two topics separate, since both are important. So the DSAG working group for custom development in ABAP and Java decided to ask their members, and many people showed interest. Finding a time slot was a challenge, but luckily another topic day was cancelled and we got its slot.

 

Steffen Pietsch, the leader of the working group, took over and organized an awesome day at St. Leon Rot near Walldorf. Kudos to him!

DSAG.PNG

The conference started with an introduction and online demonstration of BRFplus and DSM presented by Wolfgang Schaper. We learned about the current features of DSM/BRFplus and the roadmap. Of course there was an online demonstration of the BRFplus features and the rule deployment features. Wolfgang Schaper was well prepared and answered all questions by demonstrating the features online.

 

Customer Presentations

 

After that, three customers talked about their BRFplus/DSM use cases. I started with a talk about the use of BRFplus in the automation of document processes: automated evaluation of scanned questionnaires containing complex information about insurance claims. This is challenging for many reasons:

  • All stakeholders have to collaborate: business process experts, SAP experts, experts from the document reading and scanning infrastructure, and many more.
  • First, process and quality goals have to be defined, along with KPIs to measure them.
  • The evaluation using rule systems can become quite complex, since we work on data extracted from questionnaires. In fact we have been quite successful, but you can reach the point where the rule system is nearly as good as manual processing and yet manual clearing cases are still necessary. At that point the rule system can create tasks for the business user, with concise information about what to do, as well as automated activities.
  • This is a typical example of a rule-based system. Creating such systems is quite challenging, but BRFplus has many outstanding features, like extensibility and an API, that help to create these solutions.

 

In the next customer talk Günther Scheper from Claas presented some use cases for BRFplus. It became clear that every ABAP developer can successfully use BRFplus. Furthermore, it helps to make the communication between business and IT experts more efficient. At the end of this talk there was a great discussion about BRFplus design styles and idioms, and even “unusual” use cases like using BRFplus to control UI controls.

 

The last customer talk was really awesome. Gerhard Kregel and his team from Voestalpine showed the implementation in DSM of the complete knowledge of experts in steel production and logistics. They apply daily changes to the rule sets and deploy those rules together with master data(!) using DSM. Of course there is a mature governance process; they also put their own UI on top of DSM and integrated a document management system for revision-safe documentation of rules and rule changes. This shows that you can migrate a very mature and sophisticated framework of business rules into the SAP world while keeping most of the processes, by using the extensibility of BRFplus. They even did an online presentation, showed their add-on on top of DSM, and deployed a set of rules.

 

In the last talk, Carsten Ziegler presented the methodology of successful BRFplus/DSM implementations together with a firework of best practices. This was really awesome; many people including myself took notes, and I am now adding them to our internal BRFplus development guidelines.

DSAG Twitter.PNG

 

If you follow me on Twitter you can read some of my notes from this day on my Twitter timeline.

 

Why was this event a success?

 

DSAG did a very good job with the preparation and selected a really good mixture of different talks, so that everyone got inspiration for their daily work. For me it was important that, with Voestalpine, a business expert (and not only IT guys like me) gave a presentation and talked about decision management. Efficient implementation of business rules means that business and IT work together successfully.

 

If you decide to organize a BRFplus/DSM topic day, please be prepared for more people to join than you expected. We ran out of chairs twice and had to go out to get more. Decision management is a hot topic, and people want to learn about it and share their experience. Moreover, there was enough time for attendees to chat and extend their business networks.

 

What comes next?

 

I hope that there will be a follow-up DSAG meeting about DSM/BRFplus next year, and that there will be a similar variety of customer talks. Maybe there will even be a keynote from a business person who challenges the IT crowd with his expectations concerning rules, rules technology and governance. Maybe I’ll offer a talk about rules and HANA or prediction.

 

Personally, I’m still working on the topic of process automation using BRFplus, and if I find the time I’ll blog about my experience. I hope I can make it to SAP Inside Track Hamburg, and perhaps I’ll talk about this or another BRFplus/DSM topic. I have learned that working with ambiguous data is quite difficult, and at the moment I’m working on a small (more or less mathematical) approach to this topic together with a researcher from Belfast. When this work is finished it is likely that I will share my experience on SCN.

 

At the moment I’m thinking about how DSM, especially in connection with prediction and HANA expressions, can be used in SAP InnoJams, because I consider the pushdown of business logic to HANA a very promising form of code pushdown.

 

But these are just the things I am thinking about. Now it’s time for others to step forward and organize similar events in other SAP user groups. It’s like Steffen Pietsch said at the end of the event: the user groups need you and your experience.

 

I’m convinced that these events will be a success, because business rules are mission critical, make ERP transparent and easier to change, and their usage can reduce development costs. As I already told you, I consider the HANA expressions of DSM the most important.

April 2014: Reader’s Digest for SAP NetWeaver Decision Service Management (NW DSM) and Business Rule Framework plus (BRFplus)


Once again it’s time for a new blog about the news and contributions in the space of SAP NetWeaver Decision Service Management (NW DSM) and Business Rule Framework plus (BRFplus).


SAP NetWeaver Decision Service Management Deployment Workflow

With the help of SAP consultant and BRFplus veteran, Tibor Molnar, we were finally able to release a document that comprehensively describes how you can use SAP Business Workflow to set up a custom approval and release process for the deployment of business rule updates.

 

SAP Application Interface Framework with SAP NetWeaver Decision Service Management

Based on the experience gained in projects that combined the capabilities of SAP Application Interface Framework and NW DSM, Richard Feco has produced a little promotional video. This could be an interesting combination for many customers.

 

Do you make decisions? Or just rules?

One more blog from Jocelyn Dart! As always, it is a must-read for subject experts. Jocelyn introduces the decision service idea very nicely.

 

3 Simple Strategies for building Adaptable Decisions and Rules in #NetWeaver DSM & #BRFplus

Jocelyn Dart again. In this blog Jocelyn shares some thoughts and best practices about adaptability.

 

New and Updated SAP NetWeaver Decision Service Management (DSM) Roadmap

Eduardo Chiocconi recently published the 2014 roadmap deck for NW DSM. Almost all of the topics in the 2013 roadmap have already been shipped. In the roadmap for 2014 and beyond we see improvements in test automation, predictive analytics, and HANA (deployment, exchange). While the latter is a long term development, you will soon hear more about the first two (test automation and predictive analytics). I have just finished the internal development for model import from KXEN InfiniteInsight into NW DSM/BRFplus.

 

SAP NetWeaver Decision Service Management Trace Visualization

In a collaborative effort, we were able to provide a document that explains the capabilities of the improved rule execution trace. It also shows how to write a UI5 application to visualize trace data in diagrams, such as execution path or result distribution. The trace is the basis for rule optimization. We also plan to create an integrated demo showing deployment, execution of rules, tracing, analysis and adjustment.

 

DSAG Topic Day BRFplus/DSM

On April 1, more than 60 German-speaking customers and partners met in St. Leon-Rot for the first DSAG Topic Day BRFplus/NW DSM. DSAG is the German SAP User Group, comparable with ASUG in North America. Tobias Trapp summarized the event in a blog. He was also a major driver of the event. Steffen Pietsch took over the organization. Thanks to them, and thanks to further customer presentations by Günther Schepers (Claas) and Gerhard Kregel (Voestalpine), the Topic Day was a huge success.

 

Create Rule Based on Risk Violation in Request, Using BRF+ Procedure Calls

Amanjit Singh Bindra provided an excellent technical document with step-by-step instructions on how to use BRFplus in GRC Access Requests. The many screenshots make this a very instructional guide.

 

Business Rules Framework (ABAP BRF+) in Italian

A video by Andrea Olivieri and Fabio di Micco introducing BRFplus in Italian.

About performance in BRFplus/NW DSM


Some years ago we published a document with some recommendations and performance figures. The content of the document is basically still valid although it needs updates with respect to some details.

 

Continuous improvements

Over the last few years, we have significantly improved the runtime performance. Mostly, customers have been the drivers for the improvements. Looking at real world scenarios has allowed us to better optimize the generated code, since there was a specific challenge and a clear goal. The enormous customer base of BRFplus/NW DSM helps to improve and mature the product.

 

The results are several notes, some of which can be found in my previous blogs:

 

However, this is not the end. I am currently working on deployments of decision service and rules into the HANA database. It is too early to communicate any more about it. I will blog about it when the time has come to reveal the details.

 

How to identify a performance problem?

When asked for a performance analysis, I see some recurring patterns that often can be easily solved:

  1. Use of “bad” expression types
  2. Use of outdated invocation methods
  3. Old versions of BRFplus, missing fixes

 

Very often a simple test can be applied to check whether performance can be improved significantly. Whenever BRFplus loads objects from the database during rule execution, this is a strong indication of a problem. The general principle is that BRFplus should find all relevant information in the generated code. Therefore you can set an external breakpoint in method CONSTRUCTOR of class CL_FDT_MAINTENANCE. Should you stop there, take a look at the stack trace to understand the reason, which often is one of the three issues in the list.

Maintenance_constructor.PNG

 

Usage of “bad” expression types

All BRFplus standard expression types support code generation, with one exception: Dynamic Expressions. Therefore, we recommend avoiding this expression type by any means possible. It is often used because it is convenient, even though other patterns that allow compiling all the rules into ABAP code are possible. In some cases, additions to BRFplus have been created by SAP application developers or even by customers and partners. Only recently, the BOPF expression type was improved to support code generation. Whenever code cannot be generated, objects have to be loaded from the database, which is a very time- and memory-consuming step. BRFplus has a mechanism to partially generate code and to jump into interpretation as soon as it is needed. As a rule of thumb, expect a performance penalty of a factor of 500 and more as soon as an expression needs to be loaded. Loading an object will hit the breakpoint in the constructor method of CL_FDT_MAINTENANCE.

 

Use of outdated process method

When BRFplus was released for the first time with NW 7.0 Enhancement Package 1, only one API was available for the execution of business rules. A typical call is shown in the following code snippet.

bad_call.PNG

The actual call is made with method IF_FDT_FUNCTION->PROCESS (fourth line). It requires a function instance. This instance has to be created before rule execution can take place. Its creation requires selects on several database tables. Data object instances are also required. Again, you will hit the breakpoint in the constructor method of CL_FDT_MAINTENANCE in this case. Instead of using this code, always use the code template generator. In NW 7.0 Enhancement Package 2, you can use report FDT_TEMPLATE_FUNCTION_PROCESS to generate a code template. In newer releases, you can find the feature in the BRFplus workbench on the function screen (requires technical features switched on in the personalization). The code can be copied into your ABAP source. You only have to set variable values. Although the code is not as easy to read, it is tremendously fast in execution. The code template generator will also compose the code depending on the capabilities of your managed systems.

code_template.PNG

 

Old versions of BRFplus, missing fixes

On first use, BRFplus generates ABAP code on the fly. If there are problems, the process does not stop and return error messages but instead uses interpretation mode as a fallback. Often, such situations occur because of old release versions and missing fixes. You can also reliably detect this problem with the breakpoint mentioned above. On first generation it is fine to hit the breakpoint. However, BRFplus may try to generate code again and again when generation does not complete successfully. In this case you should check the notes, or open a support ticket to have someone help you identify the notes for your system.

 

NetWeaver Decision Service Management

In NW DSM, the previous three sources of errors do not occur because of conceptual differences. In NW DSM, you generate at a specific point in time and you receive feedback on whether generation was successful, with reasons when it is not (for example, use of “bad” expression types, missing fixes). The old invocation API is not supported; instead, DSM comes with a new, highly optimized and simplified API. The code generation function will automatically output the best code, also in scenarios in which NW DSM is used.

 

Problematic data access

Finally, there is a recurring pattern of performance problems that is not so easy to eliminate. In many scenarios data has to be retrieved for evaluating business rules. This may happen prior to the invocation of rules or within the business rules with the help of procedure call expressions or database lookup expressions. In any case, expect one SELECT SINGLE on a buffered database table to consume more time than processing a bunch of rules, formulas, and decision tables (if not huge). Therefore, it is essential to intelligently select multiple records from the database when possible. In NW 7.3 Enhancement Package 1, we have improved the DB Lookup expression to support aggregation and grouping. You may also use the ABAP code that calls the rules to do the select and then pass the records as context data to the rules. Alternatively, instead of DB Lookup expressions, a class for data access that internally buffers and optimizes may be created. With procedure call expressions, the data can then be retrieved from within BRFplus. In some cases it may also help to run a DB trace to identify selects that do not hit an index. Note that running BRFplus/NW DSM on a HANA DB often does not require optimizations with indexes, reducing the rule modeling effort.
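
As a rough sketch of that mass-select idea (the table, field and context parameter names below are invented for illustration and are not from any SAP delivery), the calling program fetches everything the rules will need in one go and hands it over as a table-type context parameter:

    " Sketch only: ZPRICE_COND and MATERIAL_PRICES are placeholder names.
    " One array select in the calling ABAP replaces the many single selects
    " that a DB Lookup per rule evaluation would otherwise trigger.
    SELECT matnr, price
      FROM zprice_cond
      INTO TABLE @DATA(lt_prices)
      WHERE price_list = 'STANDARD'.

    " lt_prices is then supplied as the value of a table data object in the
    " function context (e.g. 'MATERIAL_PRICES'), so the rules read from the
    " context instead of hitting the database for every evaluation.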

Solving impossible date & time calculations with #BRFplus and a Factory Calendar


This time of year, Daylight saving is switching on and off around the world, Easter long weekends are giving us a well-earned break, and it’s turning my thoughts to the joys of calendar calculations in business rules.

 

The word 'time' is the most commonly used noun in English.  Quite Interesting Facts http://qi.com/infocloud/time


ID-100172085.jpg

Image courtesy of stockimages / FreeDigitalPhotos.net

 

As much as I love http://www.timeanddate.com (and working in a global company it sometimes feels like I live and breathe by their Timezone converter), there are some date and time questions it just can’t answer.  Never mind differing timezones and the added joys of daylight saving: with so many different countries, and so many different regions (states/counties/cantons) within countries, just working out which days are public holidays at a site is a serious software challenge.  Add to that the differing working weeks, which can vary from site to site, plant to plant, location to location, even from oil rig to oil rig.  You might work in a traditional Monday to Friday 9am to 5pm office, but that oil rig or manufacturing plant – they may be working a 24x7 week, an 18x6 week or any other variation thereof.


It’s not just about knowing if the office/site is going to be open on a certain date/time; knowing if a particular day is counted as a regular working day or as a public holiday can be critical to paying the correct penalty rates for staff on the payroll, and that can lead to all sorts of #nofunatall with workers, industrial unions, and ever-delightful government bodies such as taxation authorities.

 

Workers penalised over public holiday confusion - ABC Gippsland Vic - Australian Broadcasting Corporation http://www.abc.net.au/local/stories/2010/12/07/3086983.htm

 

Having a way to handle these date and time variations is a critical challenge to business rules involving dates, such as determining if a service has been performed within the Service Level Agreement, or whether sales dollars can be counted against the current sales quarter, or when the next delivery must fall if it is to reach a factory site during open hours.

 

The good news is SAP solved this problem many years ago… by introducing the Factory Calendar.  But did you know you can also use the Factory Calendar in your SAP NetWeaver Decision Service Management and BRFplus business rules today?

 

Factory calendar maintenance in SAP http://www.learnsaptips.com/2012/08/factory-calendar-maintanance-in-sap.html

 

The Factory Calendar was introduced literally decades ago… I’ve found references as far back as 1999 for Release R/3 3.0c. It is built into so many applications across SAP that it barely gets noticed anymore; it’s so much a part of the virtual SAP furniture.  Those of us in the workflow space use it regularly for working out deadlines based on working days.  You can even schedule background jobs based on the factory calendar!

 

Schedule a background job using a factory calendar! http://****************/Tips/ABAP/BackgroundJob/FactoryCalendar.htm

 

So how exactly do you take advantage of factory calendars in business rules? Here’s how…

 

Prerequisites

  1. Create (or maintain) your factory calendar in Transaction SCAL. SAP delivers a bunch of factory calendars but you may want to create your own or at least maintain the public holidays relevant for this year (some public holidays are assigned by government decree at the start of the year).
  2. Create an ABAP class that implements the interface IF_FDT_APPLICATION_SETTINGS and implement the method GET_CALENDAR to return the 2-character factory calendar id you want to use (a complete class sketch follows this list), e.g.
    ev_fcalid = 'AU'. "The list of factory calendar ids can be found in table TFACD
  3. In the CLASS_CONSTRUCTOR method of the same class, add the following line to tell BRFplus you want to set the calendar:
    if_fdt_application_settings~gv_get_calendar = abap_true.
  4. Assign that class to your BRFplus rules application as an application exit class.  How to do that is already described here http://scn.sap.com/docs/DOC-4564 - BRFplus Application Exits
  5. Link the application exit class in the Properties tab of your BRFplus application
  6. Create a BRFplus function in your Rules Application and use a formula expression anywhere within it.
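
Putting prerequisites 2 and 3 together, a minimal application exit class could look like the sketch below. The class name is made up, and the exact signature of GET_CALENDAR should be checked against IF_FDT_APPLICATION_SETTINGS in your own release:

    CLASS zcl_my_fdt_appl_settings DEFINITION PUBLIC FINAL CREATE PUBLIC.
      PUBLIC SECTION.
        INTERFACES if_fdt_application_settings.
        CLASS-METHODS class_constructor.
    ENDCLASS.

    CLASS zcl_my_fdt_appl_settings IMPLEMENTATION.
      METHOD class_constructor.
        " Tell BRFplus that this application exit class supplies a factory calendar
        if_fdt_application_settings~gv_get_calendar = abap_true.
      ENDMETHOD.
      METHOD if_fdt_application_settings~get_calendar.
        " Return the 2-character factory calendar id (see table TFACD)
        ev_fcalid = 'AU'.
      ENDMETHOD.
    ENDCLASS.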

 

That’s it .... now all your date and time calculations in this Rules Application apply the factory calendar.

Just one more thing to keep in mind… both Factory Calendars and Timepoints assign the days of the week to a number from 0 to 6 as follows:

Saturday = 0, Sunday = 1, Monday = 2, .... Friday = 6

 

Now I could show you some boring old real-life formulas – and we’ll start with a few, but hey, this is dates and times, so let’s have a bit of fun showing how some of the date and time functions work.  Most of these examples are for dates, but similar built-in formula functions exist for times. And we won’t limit it to just factory calendars…

 

Find out if the office is open on a certain date

Office Is Open = DT_IS_ACTIVE( My Date)

 

If the office is closed, find out when it is next going to be open

Next Office Open Date = DT_NEXT_ACTIVE( My Date)

 

Calculate a deadline based on working days

Deadline Date = DT_ADD_DAYS( Starting Date, Duration In Days)

 

Work out if it's TGIF (Thank God Its Friday!)

It's Friday = IF ((DT_GET_DAY_OF_WEEK( My Date) EQ Friday) true, false)

Here Friday is a Constant with value 6

 

How many working hours left to complete that project task?

Working Hours to Complete Task = DT_DURATION_DIFF_HOURS( Starting Date & Time, Completion Date & Time)

 

How many working days was I on leave

Leave Days Taken = DT_DURATION_DIFF_DAYS( First Day of Leave, Last Day of Leave)

 

A common problem for meetings ... which timezone is my regional manager in?

Regional Manager Timezone Text = DT_GET_USER_TIMEZONE_TEXT( Language )

 

Count down the working weeks until the next SAP d-code conference

You could of course set up the SAP d-code date as a Constant expression

Weeks Until Conference = DT_DURATION_DIFF_WEEKS( Today, 20 October 2014 )

 

One for the time-poor ...Desperately checking for one more week this year

Is a Long Year = DT_HAS_53_WEEKS( Today )

 

One for the romantics ...Checking for a leap year

Proposal Date = DT_IS_LEAP_YEAR( DT_SET_PART_DAYS( DT_GET_CURRENT_DT( ), '01' ) )

 

Taking it to the next level

The previous examples show how we can use date and time calculations one at a time, but because these are BRFplus formulas we can also combine different functions to solve more interesting challenges.

 

So when is the last possible working date to transport my configuration to the testing environment?

Testing starts on the 1st of next month and it takes 2 days to go through transport approvals

Last Transport Date = DT_SUBTRACT_DAYS( DT_SET_PART_DAYS( DT_ADD_MONTHS( DT_GET_CURRENT_DT( ), 1 ), '01' ), Duration In Days )

 

Find the first working day of this week

Remember the first day of the week is Sunday in the Factory Calendar

First Working Day This Week = DT_GET_NEXT( DT_ROUND_TO_FIRST( My Date, 'WEE'  ) )

 

Find the last delivery date of this month

Last Delivery Day This Month = DT_GET_PREVIOUS( DT_ROUND_TO_LAST( My Date, 'MON' ) )

 

Find the last selling date of next quarter

Last Selling Date This Quarter = DT_GET_PREVIOUS( DT_ROUND_TO_LAST( DT_ADD_QUARTERS( DT_GET_CURRENT_DT( ), 1 ), 'QUA' ) )

 

Which is earlier? Is my colleague's timezone ahead of the system timezone?

Earliest Date & Time = DT_MIN( DT_GET_USER_TIME( ), DT_GET_SYSTEM_TIME( ) )

 

Using multiple factory calendars

Now if you need to use multiple factory calendars or dynamically pass a calendar, don't forget you can also create your own custom formula functions. You'll find instructions here http://scn.sap.com/docs/DOC-4582 - How to create custom BRFplus formula functions.  How hard is it to create your own custom formula function? I put a custom formula together yesterday, after a little bit of research.  It took me about an hour - now not every calculation will be that easy, but once you have worked out your logic it's really not that hard.

 

There are over 90 built-in date and time formula functions in BRFplus – with those, plus the use of a factory calendar, there is plenty of scope for solving the most impossible-looking date and time calculations.  So rest easy and enjoy those public holidays!

DSM Trace Visualization – Installation Tips on a Netweaver 7.40


This blog is intended as an addendum to the document “SAP NetWeaver Decision Service Management Trace Visualization” published by Wolfgang Schaper, for those installing the tool on a NetWeaver 7.40.

 

I think this tool is a great new addition to the BRF+/DSM functionality and adds a lot of value to the BRF+/DSM ecosystem. Although the description of the installation procedure is quite good, there are some further pitfalls that I ran into and that I want to share with you. So this blog should be seen as an optional nice-to-know addition to the document (especially to section 6, which focuses on NetWeaver 7.31).

 

System and Software Stack Details

The starting point for my installation was a NetWeaver 7.40 with Support Package 06 and Decision Service Management 1.0 SP03. I used the Gateway approach to make the OData service available. In contrast to my comment in the article, the OData service is not contained in Support Package 06. I found the recent Note 1966337 (BRFplus: Activation of ODATA-Backend for Lean Trace Visualization), which states that the OData service will not be officially available until SP09, but a downport exists for SP07 and SP08. So it's bad luck, or at least more work, to make this tool available for SP06 and lower.

 

Nevertheless, SP06 is a good starting point for the installation of the tool with respect to the effort that has to be spent to get things going. Trying to install it on a lower SP of NetWeaver 7.40 will force you to install a lot of notes with a lot of manual actions. For me that was a dead end, as some notes could not be implemented due to wrong dependencies.

 

Getting the Coding Done

As described in the document, you first have to step through section 2.1 to install the UI5 components. After that you have to go to section 6 to implement the OData service manually, due to the above-mentioned lack of availability of the OData service FDT_TRACE even in NetWeaver 7.40.
Before you start to implement the classes you should implement Note 1955524 (BRF+: New Lean Trace Corrections 3), as this note includes some enhancements of the type definitions in BRF+ (namely an enhancement of the type CL_FDT_WD_LEAN_TRACE_HELPER=>S_ID_INFO) that is used within the method GET_DETAILS of class ZCL_FDT_OD_TRACE_FUNCTION_DATA and will cause a syntax error if not available. So one note with no manual tasks in it is not too much additional effort from my point of view.
The source code for the classes, which you can download from SCN, can then be easily implemented and activated as described in section 6.1. Nevertheless, you should also be aware that there is already one note that corrects some issues in the OData classes (2017565 - BRFplus: ODATA back end - initial values), which you might need too but which is not contained in the delivered source files.

 

The customizing of the service described in section 6.2 of the document is also straightforward.

 

What is somewhat missing is the fact that you also have to adjust the constant value defining the name of the OData service in the UI5 application. Go to the BSP application you created in section 2.1, namely to the “js” folder contained in the “Page Fragments” folder. There, open the “helper.js” file and replace the value for the service name (in the original version you will find the coding: var servicename = “FDT_TRACE”; ) with the one you chose when customizing the OData service (usually ZFDT_TRACE).

JS_Source.jpg

Now you can start and enjoy the trace visualization service, provided you have lean traces in your system.

 

 

Some additions to the troubleshooting
  • If nothing gets displayed in the tool, you should check the entries in the table FDT_TRACE_0200. If there are none, nothing will get displayed as no traces exist.
  • If you are playing around on a sandbox system with some local functions (like me), you might just have added the lean trace option when calling the BRF+ function to create the traces, and you will certainly also persist the data. But be aware that there is another point you have to take into account when dealing with lean traces: you have to switch on versioning for the artefacts in your function; otherwise, every time you change something in the function or its ingredients, the already persisted traces might no longer be displayed. So, as a best practice, before calling a BRF+ function in lean trace mode, execute the lean trace readiness check on the function level
    LTR-Check.jpg
    If warnings appear stating that the lean trace functionality cannot be used, read them carefully and resolve them.



Finally, I hope I could help some of you avoid the same issues I ran into and speed you up in using this cool new feature of DSM/BRF+.
 

BRFplus/DSM at coming SAP Inside Track Hamburg


Last weekend I was asked on Twitter whether I would offer a lecture about business rules at SAP Inside Track Hamburg next weekend. I am really glad that people are interested in that topic, so I decided to offer a talk about BRFplus and DSM for next-generation business applications. If you have specific ideas, questions or suggestions I should cover in my talk, please send them to me, perhaps as a comment on this blog entry, or even better ask them in Hamburg.

What are my ideas for the talk? Of course I would like to give a short introduction but also share and discuss my experience with BRFplus. I will present some best practices and will also ask technical questions.

 

I have blogged many times about BRFplus/DSM, and in my opinion every ABAPer should have at least basic BRFplus skills, since this tool can make SAP Business Suite as well as custom development more transparent and easier to change. Moreover, it can lead to standardization and a reduction of development costs.

 

But BRFplus can also help you to implement really new business applications, and I will present some use cases. Last April at DSAG I gave a talk about process automation using BRFplus, which is now available on Slideshare. I think I can summarize the information of this talk in a few sentences:

 

  • Automation of data-driven processes with standardized business messages like purchase orders is easy. But working with data from the real world is difficult. It becomes really interesting when automating processes with semi-structured and highly ambiguous data.
  • Understanding and operating on real-world data means rationalisation of business processes and is a chance to implement completely new business processes.
  • At my DSAG talk I presented my experience of processing scanned questionnaires with information about accidents using sophisticated BRFplus rule sets – perhaps I will be more technical next weekend.

 

I think I will discuss these kinds of applications, but also how to use BRFplus/DSM at large scale. Such use cases are common in finance and insurance, where complex calculations are part of contracts with business partners. These data have to be maintained and replicated to operational systems. BRFplus/DSM can serve as a cornerstone for this kind of application.

 

If there’s still time I will come to DSM-HANA scenarios, which have really outstanding capabilities.

But these are just my first thoughts – if you have other suggestions on what to cover in this talk, feel free to contact me. I'm already looking forward to an exciting SAP Inside Track Hamburg.

My learnings in BRF+


Hi guys, I have spent a long time going through tutorials, practicing, and creating applications in BRF+ for my clients. I would like to share a few of my learnings.


TCODES USED:

     I have come to know that we can access BRF+ with three different t-codes, namely BRF+, BRFPLUS and FDT_WORKBENCH. This piece of information may be old, but I feel like mentioning it in case anyone doesn’t know it.


DYNAMIC PROGRAMMING:

We can create a BRF+ application from scratch via dynamic programming, for example to assess its performance. We can even add elements, functions, rules and rulesets to an already existing application via a dynamic program.

             Search for programs named FDT_TUTORIAL*; this shows a list of a few dynamic programs.


TABLES USED W.R.T. STORAGE TYPE:

     Applications created in BRF+ can have the storage types below. Once an application is created with a storage type, its storage type can’t be changed.

 


Storage Type | Client             | Transport              | Cross-Application Usage
System       | client-independent | transportable or local | Can use system objects.
Customizing  | client-dependent   | transportable or local | Can use system and customizing objects.
Master Data  | client-dependent   | local                  | Can use system, customizing, and master data objects.

 


Storage Type | Table for Local Objects | Table for Transportable Objects
System       | FDT_ADMN_0000S          | FDT_ADMN_0001S
Customizing  | FDT_ADMN_0000           | FDT_ADMN_0001
Master Data  | FDT_ADMN_0000A          | FDT_ADMN_0001A

NOTE - I believe I have got the table information right. Please let me know if I am wrong; I would like to make corrections in that case.

 

USEFUL PACKAGES AND PROGRAMS:

            Look in the package ‘SFDT_DEMO_OBJECTS’; SAP has provided examples for almost all possible scenarios in BRF+.


HOW TO CALL BRF+ FROM ABAP:

          I have seen most people use and recommend the kind of approach below to call a function of a BRF+ application.

________________________________________________________________________________________________

CONSTANTS: c_function_id TYPE fdt_uuid VALUE 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'.

DATA: lo_function TYPE REF TO if_fdt_function,
      lo_context  TYPE REF TO if_fdt_context,
      lo_result   TYPE REF TO if_fdt_result,
      lv_result   TYPE REF TO data,
      lo_message  TYPE REF TO cx_fdt.

FIELD-SYMBOLS: <result> TYPE any.

TRY.
    " Get the BRFplus function
    lo_function ?= cl_fdt_factory=>if_fdt_factory~get_instance( )->get_function( c_function_id ).

    " Set the BRFplus function context (input variables)
    lo_context = lo_function->get_process_context( ).

    " Call set_value once per input parameter: iv_name = parameter name, ia_value = its value
    lo_context->set_value( iv_name = '...' ia_value = '...' ).

    " Process the BRFplus function: pass the context and fetch the result
    lo_function->process( EXPORTING io_context = lo_context
                          IMPORTING eo_result  = lo_result ).

    " Retrieve the BRFplus function result into field symbol <result>
    CREATE DATA lv_result TYPE string.   "adjust the type to match your function's result data object
    ASSIGN lv_result->* TO <result>.
    lo_result->get_value( IMPORTING ea_value = <result> ).

  CATCH cx_fdt INTO lo_message.
ENDTRY.

________________________________________________________________________________________________

 

        Though the above code is easy to understand, one may get confused and lost in the program in a complex scenario. BRF+ provides two other ways to call a function of an application from ABAP.

 

1. Go to an application -> go to a function -> in the Detail tab, click on ‘Create Code Template’ and use the generated piece of code in your ABAP program to call that function.

Create Code Template.jpg

      Select 'Show comments' and click on Apply as shown below.

Click on Apply.jpg


          The code is generated as shown below and can be used in an ABAP program to call a function of the BRF+ application.

    Code is Generated.jpg

 

2. In the menu bar, go to Tools -> Function Module Generation (RFC) -> select the required function from an application and either create a new function group with a function module, or specify an existing function group and the name of a new function module to be created -> Generate.

          The required function module is generated with the logic to call the required function, with input and output parameters as defined for the function. This function module can be used to call a function of a BRF+ application from anywhere in ABAP.

Select RFC.jpg

         Select the required function and create a function module (create a function group if one doesn't exist and check the 'create function group' box). For a local object, check the local object checkbox; otherwise specify the required package for the function module, as shown below.

Select Function.jpg

   And upon clicking on 'Generate'

  Generate.jpg

And execute the Function module as shown below

execute the FM.jpg

  This function module has the same inputs as the function and returns the same output as the function. It can be used anywhere in ABAP to call the required function 'DETERMINE_STR'.

 

 

CATALOG:

                   Creating a catalog is very useful for functional consultants and users who maintain business rules in the final system. Functions of either the same application or different applications can be grouped together in a catalog.

             Consider an application ZPO_BRF+ which has a few functions related to Purchase Orders, ZPR_BRF+ which has a few functions related to Purchase Requisitions, and ZMIGO_BRF+ which has a few functions related to Goods Receipts and Goods Issues.

Then we can create a catalog named Logistics_MM and add all the functions from the above applications (only if all the applications are at Global level or related via a hierarchy). This makes it easier for the user or functional consultant to maintain all the business rules for which he is responsible.


BRFplus in Big Data scenarios


Hello SCN-Community,

 

this is my first blog here on SCN, so may I introduce myself first. I'm Daniel Ridder; I work at AOK Systems GmbH, Germany, as project manager and lead architect in SAP development projects. Recently my focus has been on business rule management, the development of HANA-ready solutions, and operational analytics in the context of Big Data scenarios.

In this blog I want to give you a brief look into the work I have done and the experience I have gained. In the hope of not boring you, I will split this blog into a few articles. I guess the following structure will be fine:

 

  1. Introduction - What is it all about? - "A brief intro to the business we have to deal with"
  2. Rulesets for calculating prices in the context of Big Data scenarios - "BRFplus and our suggestions for working with big decision tables and the use of parallelization techniques to deal with a huge amount of data"

 

This article deals with both topics. Further topics would be:

 

  1. Explorative search techniques to determine anomalies in invoices – how could we support the end user during his daily work?
  2. Operative analytics with BW in ERP

 

But first things first – I hope I will be able to handle it…

Introduction  - What is it all about?

The project deals with the processing of medical invoices (ambulatory medical and dental care). Such collective invoices are sent quarterly from the respective medical association to each statutory health insurance fund. Each invoice is supported by additional information about diagnostic data and medical procedures, especially surgeries.

A single data delivery contains approximately 7,000,000 invoices supported by an average of 120,000,000 medical treatments. Because of that, an invoice and its accompanying documentation are modeled as a 1:n relationship. Over a year, an average health insurance company deals with 28,000,000 invoices and 480,000,000 medical procedures. This huge amount of data has to be inspected and billing errors have to be detected. Additionally, the billing amount of each invoice has to be calculated by the system. To determine a billing error you have to look across more than five billing quarters (needed to check for overlapping bills for an insured person). By now you can imagine how long the runtime is going to take.

 

Rulesets for calculating prices in context of Big Data scenarios

After this brief introduction, and in the face of the described amount of data to process, I presume we may talk about Big Data.

One goal is to calculate the amount of each invoice. To do this we have to determine the price of every medical service. It sounds easy, but we have to do it for about 120,000,000 medical procedures in each quarterly collection. After that we can sum the prices and determine the cost of an invoice.

The calculation of a price is not done by a simple lookup in a price list. The price of a medical service is defined by a regional billing contract negotiated between the health insurance fund and the respective medical association. If a single service is not covered by such a contract, a general standard contract has to be applied.

So let me summarize our requirements: we have a huge set of medical data that has to be checked against standard and individual contracts that include billing rules. The management of these rules has to be transparent and flexible, and operations using those rules have to be scalable and high-performing.

On our first try we chose BRFplus – and it turned out to be the right decision.

With this article I want to share our experience and our solutions to the problems we were faced with. In advance I want to thank Carsten Ziegler, Wolfgang Schaper and Tobias Trapp for the support they provided.

 

 

 

Regional billing contracts and the problems raised by dealing with them

Slightly simplified, a billing contract is represented by a logical concatenation of expressions connected by AND. This construction is perfectly represented by a decision table.

DT.JPG

Unfortunately a few problems arose at this point:

The whole billing contract for a quarter contains about 37,000 rules if all 17 medical associations are included. So we were faced with the following problems:

 

  1. Decision tables of such a size can be handled by BRFplus. But we found that the BRFplus Workbench is no longer the right tool to maintain these rules: we noticed considerable latencies while browsing the decision table and when modifying a single row.


During rule execution, the generated code of a decision table passes through the rows consecutively from top to bottom. So the order of the rows becomes important; otherwise the following problems occur:

 

  1. If the current invoice being processed is subject to the rules of the 17th of 17 medical associations, we have to evaluate about 34,000 rows before we reach the correct section of the decision table.
  2. If the order of the rows in the decision table is arbitrary, it is possible that a rule for a very frequently billed medical service ends up in the middle or, even worse, at the bottom of the decision table, which leads to bad processing times.
  3. And last but not least, we noticed considerable growth in the load size of the generated ABAP class of the BRFplus function. On a yearly basis we expect a load size of about 85 MB.

Our chosen solutions

We solved topic 2 after numerous attempts at creating a suitable way of maintaining the decision table. From the end user's point of view it is better to separate the quarterly billing contracts for each medical association. So we provide a single decision table for each quarter and medical association, which we trigger via a gate expression in the corresponding ruleset. All other rules are not evaluated in this case.

FN.JPG

RULE.JPG

Unfortunately we gained no performance improvement when administering a single decision table in the BRFplus Workbench. Currently such a decision table contains an average of 2.100 rules (with a variance from 600 to 5.000 rules). Because of that we provide an ABAP report for maintaining the decision tables. The report provides an ALV grid where the end user can define new rules or initially fill the grid from a set of rules stored in a text file (this covers the use case of maintaining rules in a spreadsheet application). The table model of the ALV grid is defined dynamically from the underlying BRFplus decision table.
Upload.JPG
DT_ALV.JPG

With this approach we solved topic 1 by providing a new frontend for rule maintenance. Additionally, the power user is still able to use the Workbench, e.g. to redefine or modify the whole ruleset.

Please note that this approach is quite common in the BRFplus/DSM world. Since the framework provides an API, it is easy to build an alternative frontend for rule maintenance. Moreover, you can implement additional checks and integrate with other dialogs.


We solved topic 3 by automatically analyzing the underlying data and rearranging the rules. By creating an internal hit list (via a SQL aggregation) of the count of each medical service in the invoice collection, we were able to sort the billing contracts in the ALV grid and generate the decision table in BRFplus in an optimized order, as sketched below.
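A minimal sketch of such a hit list in ABAP Open SQL; the table ZMED_PROCEDURE and its columns are hypothetical placeholders for the delivered medical procedures:

" Hedged sketch: count how often each medical service occurs in the invoice
" collection, so that the most frequent services can be placed at the top of
" the generated decision table.
TYPES: BEGIN OF ty_hit,
         service TYPE string,
         hits    TYPE i,
       END OF ty_hit.
DATA lt_hitlist TYPE STANDARD TABLE OF ty_hit.

SELECT service COUNT( * )
  FROM zmed_procedure          " hypothetical table of medical procedures
  INTO TABLE lt_hitlist
  GROUP BY service.

" Most frequently billed services first.
SORT lt_hitlist BY hits DESCENDING.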


The remaining topic was the issue with the load size of the generated BRFplus class. With four decision tables in one ruleset we measured a load size of the generated ABAP class of approximately 5 MB.

LOAD.JPG

If we extrapolate this to a whole year and to all medical associations, we would expect a load size of roughly 85 MB (5 MB x 17 medical associations).


Currently BRFplus does not provide an elegant mechanism to deal with this. We already mentioned putting each decision table into a separate function that is set to functional mode. As a result, we modularize the rule system into different functions. To call the right one, another BRFplus function has to be created; it evaluates the medical association and the billing quarter to choose the right function to call.


Another very nice solution to the problem would be a function that evaluates a decision table whose single result column can call another function and return a whole structure instead of just a single value (see the following screenshot). Unfortunately this is currently not an option in BRFplus.

DECFN.png

How to parallelize the calculation of Prices

The calculation of prices was handled with a single-object processing paradigm: the function context in BRFplus was defined by a single invoice. Our tests showed a negative performance impact when fetching data inside BRFplus, and even building a function context with more than one invoice showed a negative impact. Since we assume this behavior is related to our special use case, we would not generalize it to other use cases.
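For illustration, here is a minimal sketch of calling such a decision service once per invoice via the standard BRFplus function API. The function GUID, the context data object name 'INVOICE' and the invoice structure are assumptions for this example, not the productive coding:

" Hedged sketch: process a BRFplus decision service for one invoice.
TYPES: BEGIN OF ty_invoice,
         invoice_id TYPE string,
         quarter    TYPE string,
       END OF ty_invoice.

DATA: ls_invoice     TYPE ty_invoice,
      lv_function_id TYPE if_fdt_types=>id,   " fill with the GUID of the decision service
      lo_function    TYPE REF TO if_fdt_function,
      lo_context     TYPE REF TO if_fdt_context,
      lo_result      TYPE REF TO if_fdt_result,
      lv_amount      TYPE p LENGTH 15 DECIMALS 2.

lo_function = cl_fdt_factory=>if_fdt_factory~get_instance( )->get_function( lv_function_id ).
lo_context  = lo_function->get_process_context( ).

" Pass the current invoice into the rule context and execute the function.
lo_context->set_value( iv_name = 'INVOICE' ia_value = ls_invoice ).
lo_function->process( EXPORTING io_context = lo_context
                      IMPORTING eo_result  = lo_result ).

" Read the calculated billing amount from the result.
lo_result->get_value( IMPORTING ea_value = lv_amount ).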

For parallelizing the workload we chose the parallel processing framework (FPP) in the ABAP package BANK_PP_JOBCTRL. With this framework we can easily spread the workload across several server groups. You can find good articles about FPP here on SCN, so there is no need to cover it in this blog.

 

OSS-Notes

Thanks to the support of Carsten Ziegler, the following OSS notes are recommended if you deal with big decision tables or face performance issues:

 

1930741: Performance Improvements in the BRFplus Design Time

1936926: Runtime improvements for decision tables

1940360: Changes to the Application Server Buffering for BRFplus Meta Data Tables

1938697: BRFplus: Remove unnecessary data conversion

1901192: Syntax error in generated code for decision table

1980560: Performance Improvement for Table Operation Expression

 

And we implemented note 2022076 (Issues in using reserved namespaces starting with /) to prevent the rule execution from falling back into interpretation mode.

Summary

Decision tables are one of the most useful expression types in small- and large-scale use cases. In large use cases you can use them to encode complex rules that are part of insurance contracts. When you work with huge decision tables, don't forget to optimize the order of the rows, and think about splitting huge rule systems into different functions, since the load size of a single generated ABAP class is limited.

Moreover, search the OSS for notes that optimize DSM/BRFplus performance. And last but not least: use the API of BRFplus/DSM to build a pleasing UI for the users who maintain the rule system.

 

I hope this first article gives you an impression of what we are currently working on, and that you got some inspiration to work with BRFplus/DSM, especially with decision tables.

Arbitrary expressions in a row of a BRF+ decision table key column


tl;dr

BRF+ allows only scalar data elements and expressions as columns in decision tables. Conditions on non-scalar values (tables) therefore usually require you to add expression columns with scalar result values. This approach requires changing the decision table settings every time a new value needs to be tested for.

An alternative is to use the boolean constant expression True as a column and to add the table expressions directly into the cells of the decision table.

Business case

We use a SAP workflow solution that can be configured entirely via BRF+ (rule-based Workflow of SAP Master Data Governance). We want to implement the following rules:

  • If data governed by a certain department (e.g. controlling) was changed, request approval from that department
  • Else skip the approval

The list of content owner departments (more than 10) is known, but can of course change over time. There is one approval path in the workflow for each of the departments. While on the path, a BRF+ rule is queried to determine whether the actual approval is required or not.

In a request process, an arbitrary number of data fields can be changed. Each data field is assigned to exactly one group, but there is no restriction on the number of field groups in a change request.

Implementations

The implementations shown are reduced to the decision table. Instead of outputting the values needed to steer the workflow, we just use some log actions to showcase the idea.

Standard approach

In the standard approach, we create expressions with scalar return values (here: Boolean). Each expression searches for the occurrence of a given field group name in the list of changed field groups.

The resulting decision table schema looks like this:

wide_table_settings.PNG

Now we can fill the decision table like this:

wide_table_rows.PNG

While this solution works, adding a new field group requires changing the table schema – a change that requires re-testing, since it affects the complete decision table. Also, the table gets harder to maintain as we add more values to check for, since each value requires a new column.

Expression in row

An alternative implementation inverts the notion of value and expression by placing the constant expression True in a column and the expressions into the rows of the table.

The table schema is depicted in the following image. Note that you can get the constant True by clicking the "include FDT standard application" check-box in the context search dialog:

narrow_table_settings.PNG

In the table, add the expressions by choosing Select... from the context menu of the table cell. We can re-use the expressions created for the first approach here, as long as those expressions return a boolean value:

narrow_table_rows.PNG

Remarks

  • The condition on the Amount value was added only for instructional purposes, to show that you can place any condition into the table cell as long as its return value is of type boolean. Since Amount is already a scalar value in BRF+, you should rather use it in a column of its own, since this form is much less expressive than the original range expression on a dedicated column.
  • You cannot use a range expression on Amount directly in the cell, even though range expressions also have a boolean result type. Trying to do so gives an error message about "True is not used as input". To work around this issue, wrap the range in a boolean expression.

Conclusion

I hope this tip is something new & useful for the community. I have found a ton of interesting material in the forums and blogs already that helped me in my project, and I hope I can give something back this way. BRF+ is a fun & exciting tool to work with, and I want to thank Carsten Ziegler for writing & constantly improving it.

August 2014: Reader’s Digest for SAP Decision Service Management (DSM) and Business Rule Framework plus (BRFplus)


Once again it’s time for a new blog about the news and contributions in the space of SAP Decision Service Management (DSM) and Business Rule Framework plus (BRFplus).



Solving impossible date & time calculations with #BRFplus and a Factory Calendar

New SAP Mentor Jocelyn Dart needed to calculate dates and times using a factory calendar. No problem! She documented how date and time calculations work, with some examples, and also explained how a factory calendar can be used.

 

 

About performance in BRFplus/NW DSM

After doing some performance analysis in a customer system, I took the opportunity to write a blog about it. In the blog, you will find important notes, a simple check to identify a common performance problem, and other tips.

 

 

TechEd 2013 Hands-On Exercises

Wolfgang Schaper and Regina Weidemann have recorded a video that shows the TechEd 2013 exercises and solutions for the session CD160: SAP NetWeaver Decision Service Management – A Paradigm Shift (Hands-on). The video is very detailed and instructive.

 

 

BRFplus in Big Data scenarios

This is the first blog by Daniel Ridder from AOK Systems, and it's not a bad one at all. We had some working sessions on how to best use BRFplus and how to further improve it for their scenario. It's very kind of him to share his experience in the blog.

 

 

SAP TM Enhancement Guide

SAP Transportation Management is a product that has nicely integrated BRFplus. Credits go to Bernd Dittrich and his team. In the guide, starting at 4.3, you can see how it’s done. I know of several customers that frequently use BRFplus rules in SAP TM.

 

 

Usage of BRF+ tool in BW for a complex rule

Thanks to Sergio Locatelli for posting this blog. Using DSM/BRFplus in combination with Business Warehouse is a common pattern that had not yet been described.

 

 

Arbitrary expressions in a row of a BRF+ decision table key column

Jürgen Lukasczyk uses examples to explain how expressions can be nested.

 

 

BRFPLUS in SAP CRM WEBUI Framework

Rajwin Singh Sood from Atos contributed a nice document with a step-by-step illustration of how business rules are created and used in the business logic of the SAP CRM WEBUI framework.

 

 

DSM Trace Visualization – Installation Tips on a Netweaver 7.40

Christian Lechner has already shown a very high level of expertise commenting on questions in the forum. I am happy to see that he is also blogging on SCN now. In the blog, he provides some additional information about the trace visualization as published here.

 

 

My learnings in BRF+

This blog is very different from most of the other blogs. 

Siva rama Krishna Pabbraju shares his new knowledge of BRFplus.

 

 

BRF Plus Training (Business Rules Framework Demo)

Anubhav Oberoy has uploaded many trainings on YouTube; this time about BRFplus. He has a very unconventional style of explaining things while drawing and typing in PowerPoint or creating objects in the system.

 

 

--- German only ---

Newsletter der AOK Systems über BRFplus

AOK Systems uses BRFplus in many processes. Find out more about it here.

 

 

Geschäftsregeln effizient in SAP-Systemen umsetzen - BRFplus und SAP Decision Service Management

Lukas Bretschneider will give a webinar on September 19, 2014. The webinar is conducted in German. Please contact me if you would like to attend a webinar in English.

 

 

Workshop Variantenkonfiguration

For several years now I have been in contact with Patrick Müller from epsFlow, a Walldorf-based consulting company focusing on variant configuration. They also support customers that want to combine variant configuration with BRFplus. I am also in touch with other consultants, and maybe you will hear more about the combination of variant configuration and DSM/BRFplus soon.

German Web Seminar: BRFplus and DSM / September 16


Working in Technology Consulting at SAP, it is quite cool to see how many great tools we have at SAP – and some of them are built right into the NetWeaver standard.

Unfortunately, some of these tools are not very well known by our customers (or even colleagues, partners, etc.). One of the tools I am working with is BRFplus and DSM (SAP Decision Service Management). In different projects we see the value of modelling logic as business rules instead of hard-coding it in ABAP (or customizing tables).

 

To share some information about this tooling, we are planning a web-based session on September 16 from 10.00 am (CEST) to 10.20 am.

As you can see, this is quite a tight schedule to get into BRFplus and even the differences / additional value / features of DSM in 20 minutes – but I guess this is why it is called "Espresso".

 

If you are interested in BRFplus / DSM, or if you know someone who might benefit from this session, and if the German language is not an issue, feel free to register for this session here.

How to Download & Upload Decision Table from Rules Manager in CE 7.3


This guide provides instructions on how to download and upload a Decision Table from the Rules Manager in Excel format.

 

Note: This document applies to all CE 7.3 support packages.

 

Step 1: Log in to the Rules Manager using the URL http://<Server>:<port>/rulesmanager

1.png

Note: Log in with a user that has the role SAP_BRM_ADMIN assigned and full rights for the project in question.

 

Step 2: Select the Rules Project, click on Open Project and select the Active Version:

 

1.png

 

Step 3: Select the "RuleSet" for which the version is to be checked.

1.png


Step 4: Expand the "Decision Table" node by clicking on the 1.png icon. Select the "Decision Table".

1.png


Download or Export "Decision Table" Values


Step 1 : To Download or export "Decision Table" Values Click on the "Export" Button.

1.png

Step 2 : A popup will open up as shown below. Click on save to download or export "Decision Table" Values.

1.png

Step 3 : Save it and open it. You can see the "Decision Table" Values.


Upload or Import "Decision Table" Values


Step 1 : To Upload or Import "Decision Table" Values Click on the "Import" Button.

1.png

Step 2 : A popup will open up as shown below. Click on "Browse" button first to Select the Excel you wish to upload to the "Decision Table".

1.png


Step 3 : Click on "Upload" Button.

1.png


Step 4 : Enter the comments you wish to add, as shown below. Select the "Activate Changes" checkbox. Finally, click on the "Submit Changes" button to complete the upload or import process.

1.png

To check whether the changes have been reflected, you can check the version.


Note: To know how to check the version Click Here

SAP TechEd && d-code 2014: Session Schedule for SAP Decision Service Management


Fall means TechEd time for me. This year there are some changes.

  1. TechEd is now called “SAP TechEd && d-code”. Fine, I can live with that. 
  2. This year I will only do the Las Vegas sessions. Wolfgang Schaper from product management will take over the sessions in Berlin for EMEA TechEd. Some of you probably know him from previous years, and he has also produced some popular SCN content. Anbusivam S from the support team will be the speaker for the Bangalore TechEd.
  3. Unfortunately, there will not be hands-on sessions this year. Competition is intense and I could not secure the hours for us this time.

In the following you can see the preliminary session schedule for all locations.


TEC839: Road Map Q&A: SAP Decision Service Management

Overview of the new and innovative things planned in the short, medium, and longer horizon for SAP Decision Service Management. New features are demoed as available. Ask your questions, share your experiences, and place your requests to us.

 

  • Las Vegas, Tuesday October 21, 6:00 pm – 7:00 pm, Q&A 1 Show Floor
  • Las Vegas, Thursday October 23, 12:00 pm – 1:00 pm, Q&A 3 Show Floor
  • Berlin, Wednesday November 12, 2:00 pm - 3:00 pm, PR 4.2 in Hall 4.2
  • Berlin,  Thursday November 13, 3:00 pm - 4:00 pm, PR 4.4 in Hall 4.2
  • Bangalore, Q1, 2015
  • Bangalore, Q1, 2015

 

TEC204: Business Rules and Operational Decision Management with SAP

Learn about SAP's rules technology portfolio and strategy, including the SAP HANA-native capability to define and execute business rules and ABAP-based decision service management. See demos from customer use cases that underpin the offered capabilities, and rules that leverage the power of SAP HANA.

 

  • Las Vegas, Wednesday October 22, 11:45 am-12:45 pm, Palazzo Ballroom M
  • Las Vegas, Thursday October 23, 3:15 pm-4:15 pm, Delfino 4104
  • Berlin, Wednesday November 12, 12:15 pm - 01:15 pm, L10, City Cube L1 - A8
  • Bangalore, Q1, 2015

 

EXP17664/EXP17632: Business Rules Management in ABAP and SAP HANA Applications

SAP Decision Service Management software provides powerful business rules management for applications written in the ABAP programming language. Learn how you can also use SAP Decision Service Management to push decision services and business rules into SAP HANA applications.

 

  • Las Vegas, Wednesday October 22, 3:00 pm-3:30 pm, Show Floor Lounge 9
  • Berlin, Wednesday November 12, 5:00 pm - 5:30 pm, Expert Lounge EL 4.3, Hall 4.2
  • Bangalore Q1, 2015

 

EXP17665/EXP17634: Combining Business Rules and Predictive Analytics in ABAP Applications

Learn how to use business rules and predictive analytics in applications written in the ABAP programming language to support more powerful yet flexible business processes. This session features a demonstration of the SAP InfiniteInsight solution and SAP Decision Service Management software to show how the automation of operational business decisions can be made easier and more powerful.

 

  • Las Vegas, Wednesday October 22, 3:30 pm-4:00 pm, Show Floor Lounge 9
  • Berlin, Wednesday November 12, 4:00 pm - 4:30 pm, Expert Lounge EL 4.3, Hall 4.2
  • Bangalore: Q1, 2015

A quick look at DSM HANA Expressions using Dynamic Database View


In this blog entry I would like to share some experience with decision tables created by DSM that run on HANA. In my opinion it is one of the most exciting technologies so far:

  • Business rules are far superior to customizing and make SAP Business Suite and custom development more transparent and easier to change.
  • BRFplus (and also HANA) has the concept of decision tables, which is one of the most useful expression types when using business rules. They are easy to use and business experts can understand them.
  • With DSM you can push business rules down to HANA, and I think this has enormous potential. One killer application is the simulation of rule changes. So far this is a very complex task where business experts and technical consultants have to work hand in hand, and they need system landscapes for tests. Wouldn't life be much easier if this simulation capability were supported directly by your SAP application running on HANA?

In this blog I'll introduce some concepts, show a basic but somewhat realistic example, and discuss some best practices and, last but not least, possible improvements for coming DSM versions.

 

Welcome DSM!

If you want to use HANA expressions in BRFplus you have to use the add-on Decision Service Management (DSM). For those who don't know the difference between BRFplus and DSM, let me try an explanation. BRFplus is a technology of the SAP NetWeaver platform. Technically, DSM is an add-on to BRFplus that is seamlessly integrated into it – so BRFplus is not disrupted by DSM. The opposite is true: DSM completes BRFplus by introducing a number of features that are necessary for the management of business rules, e.g.:

  • better deployment mechanisms for rules
  • new tools for testing and debugging
  • better governance: you can centralize the development of rule systems and also introduce additional metadata that helps you with the migration and maintenance of rule systems

 

In fact the DSM strategy goes even further, and there are also partner solutions that help you deploy business rules to non-SAP systems. But all those features are beyond the scope of this blog; instead I will discuss only the feature of pushing decision tables down to HANA. You may ask: why are business rules in HANA important? Usually decision services implemented in BRFplus/DSM are called from ABAP for a single line item and return calculated values for that single item. This is useful, but with HANA we can do more:

  • In Big Data scenarios, code pushdown to SAP HANA is reasonable, so decision services should be deployable to HANA.
  • Even in classic scenarios with data of moderate size, the use of HANA is still promising since it allows you to simulate the effects of changes to rule systems. With HANA you can calculate the effects of changed rule systems on the fly and analyze the results.

The last aspect is the most important, since it makes SAP Business Suite transparent and easier to change: you can adapt your business rules and practices faster when you have to react to competition, compliance, legal requirements and more.

 

In an ideal (but not always realistic) architecture we would use the same decision table both for decision services on single business objects and for mass operations on the HANA database model. And there is good news: DSM is HANA-ready! This is how it works: DSM has a new object type called Dynamic Database View. The view connects a data source to a decision table, and the result is a HANA artifact which can afterwards be transformed into a transportable HANA artifact. So you still have to deal with the transport of HANA artifacts, which is still a little bit unhandy, but if you have built HANA side-by-side scenarios you already know everything you need to know.

 

So DDBV uses HANA decision tables, which are well known. For those who have no experience with this technology, let me explain in a few words: there are three possible ways HANA decision tables can operate:

  • They rely on a database table and can change the data.
  • The second option (and we will discuss this one) creates a calculation view with a result column that is calculated.
  • Another option is that the view relies on an Analytic or Attribute View and the decision table performs a projection: a certain row can be included in or excluded from the result set.

 

By combining the data model (e.g. a HANA Calculation View) with a BRFplus decision table, you can use the object in a DB Lookup expression in a BRFplus application.

 

A simple but realistic example

Wolfgang Schaper explained the steps for creating a decision table, and in his whitepaper he chose a database table. Usually this won't work, since the data is typically scattered across many different tables – perhaps even customizing tables are involved. This makes it even more difficult, since you often have to avoid hard-coded constants. So what should you do? In general you will have no choice but to create HANA Calculation Views. If you are lucky and your developers have HANA skills, this will be an easy task.

 

So let's look at an example which is a typical and somewhat realistic use case, since different tables and customizing tables are involved:

  • I have tables ZITEM and ZPOSITION, which are in a one-to-many relationship.
  • ZITEM has an association to the central business partner data, and in my decision table I would like to use the birth date from transparent table BUT000.
  • The elements of ZPOSITION have a classification attribute that should be used in my decision table and can contain values like VERY LOW, LOW, MIDDLE, HIGH, VERY HIGH.
  • Moreover, I don't want to hard-code these values since they rely on customizing (transparent table ZCLASSIFICATIONS).

 

To use a decision table in this scenario I have to transpose this item/position relationship into a single table:

  • ITEM – one row for each entry of table ZITEM
  • BUYER – a business partner number. This would only be useful in my decision table if I wanted to define special rules for some partners
  • BIRTHDATUM – one attribute of BUYER from transparent table BUT000
  • CLASSIFICATION1 – corresponding to the value CLASSIFICATION1 in the customizing table ZCLASSIFICATIONS
  • CLASSIFICATION2 – corresponding to the value CLASSIFICATION2 in the customizing table ZCLASSIFICATIONS
  • CLASSIFICATION3 – corresponding to the value CLASSIFICATION3 in the customizing table ZCLASSIFICATIONS

 

The values CLASSIFICATION1, CLASSIFICATION2 and CLASSIFICATION3 should be non-NULL if a row with that value exists for the item in the ZPOSITION table.

 

This will become clearer with an example. These are the items; the first item is associated with a business partner:

zitem.JPG

Each item has a certain number of positions, each has a classification:

zposition.JPG

The classifications in my scenario are defined in a customizing table. Only those classifications are relevant in my scenario – the other ones can be omitted here:

ZCLASSIFICATIONS.JPG

When I join everything together I get the following (flat) result. Please note that some values (depending on the customizing table above) are taken from the positions up to the items:

zjoin.JPG

Now I can define a decision table that works on each item and defines a decision service that calculates output results. But for that I have to join all this data together in the described manner. One possibility is to create a HANA view that performs the join.


And this is exactly what you have to do when using a DDBV: you can create a Calculation View in HANA Studio to perform the necessary SELECTs and JOINs. You can see such a view here:

hanaview.JPG

I'll come back to the SQL statement used in the view later, but first I want to complete the workflow for creating a decision table based on a HANA view.

 

The Design of a HANA Decision Table

To access HANA data there is a new object type called Dynamic Database View (DDBV). With this object you can consume a table or a HANA view. So I define a DDBV for the calculation view and select database fields as result fields, as shown below:

c2.JPG

As result data I define a new calculated field called CLASSIFICATION which is calculated in a decision table POSITIONS_DT:

c3.JPG

This decision table can be defined arbitrarily, e.g. as follows.

c4.JPG

As a result, a HANA decision table is generated from the BRFplus decision table, with the HANA Calculation View as its data source:

c1.JPG

In fact you can also define additional input parameters for the decision table that can be used in the column expressions but are not passed through into the view.

 

So let me summarize:

  • The DDBV object is the link to a HANA artifact (e.g. a Calculation View) that resides on a remote HANA DB (think of a schema of a BW on HANA system, perhaps containing data even from non-SAP systems) or on an ABAP on HANA system (which of course has HANA as its primary persistence).
  • The decision table working on the view is optional. It is generated from a BRFplus decision table, but the expressiveness is limited since you can't use all expressions (think of custom-defined ABAP formulas, for example). Unfortunately the supported features don't seem to be documented, so you have to try it out.
  • Usually the data is consumed using an ordinary DB Lookup expression in BRFplus. But since you know the HANA artifact, you can also call it directly, using ADBC for example (see the sketch below), or tie it to an ALV on HANA grid for further analysis.
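A minimal ADBC sketch for reading the generated view directly; the view name is the one from the appendix below, while the selected columns and the target structure are assumptions for this example:

" Hedged sketch: read the generated HANA view via ADBC (client handling omitted).
TYPES: BEGIN OF ty_row,
         item            TYPE string,
         classification1 TYPE string,
       END OF ty_row.
DATA lt_rows TYPE STANDARD TABLE OF ty_row.

DATA(lo_result) = NEW cl_sql_statement( )->execute_query(
  |SELECT "ITEM", "CLASSIFICATION1" FROM "ZMYSCHEMA"."ZVJOINED_POS"| ).

" Bind the internal table as target and fetch all rows of the result set.
lo_result->set_param_table( REF #( lt_rows ) ).
lo_result->next_package( ).
lo_result->close( ).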

The Main Challenge: View Building

As I said before, decision tables work on relational data, and due to normalization the data is spread over many database tables. So the main challenge is view building to create the input data for a decision table.

 

Since DDBV supports HANA artifacts (working on a single ABAP table is often not realistic), HANA development is necessary, and so are HANA artifacts that can be transported using CTS+. So you can start implementing this scenario, but from my point of view it is not as comfortable as it could be. Here my opinion is very simple: if you can solve pain points with HANA development, then you should definitely do so and start creating HANA views.

 

For the sake of completeness I will mention another option: in scenarios where BW and HANA are involved you can generate HANA views from DSOs; this is described in the HANA development guide. Please note that this is possible when you are using an Enterprise BW, but it should also work with an Embedded BW on operational data.

SAP’s “Secret” Weapon: Core Data Services

Since decision management is about making decisions based on operational data, IMHO a plain ABAP approach would be best. The DSO approach mentioned above should be avoided unless you have good reasons for it.

 

In my opinion a future version of DSM should support ABAP database views, extended by result columns, since this would lead to a purely ABAP-based programming model. But wait, didn't I write that classical database views are not useful? Yes, but this is only half of the truth, since ABAP in NW 7.40 supports new view building possibilities using ABAP Core Data Services. The advantage of this approach is that it is seamlessly integrated into the ABAP Workbench and you don't need to care about the clumsy HANA transport mechanisms like the HANA Transport Container. Unfortunately we can't define calculation views in CDS (an extension of normal views), otherwise it would be very easy. First you define a join for each relevant classification:

joinedpos_ddl.JPG

Since multiple entries can occur we select only distinct ones:

joinedclass_ddl.JPG

Then we join everything together:

joinedcust.JPG

An SQL expert will immediately recognize possibilities for simplification and optimization. I show this example nevertheless, since it was the only possibility to solve the above-described join in NW 7.40 SP4. In SP5 and later we have completely new possibilities, and the optimization of the above CDS views is "left to the reader as an easy exercise".


This is what we can learn from the example:

  • Pushdown of a decision table to HANA needs view building, which should be done using CDS, since CDS is superior to classic ABAP Dictionary DB views.
  • If there were a way to extend the above view by result columns in CDS, it could easily be linked to DSM.
  • IMHO, ABAP CDS views should get parameters, and we should use them not only in a decision table but also pass them to CDS for reasons of flexibility. Moreover, it should be possible to pass parameters from DSM to CDS.
  • We should start to think about whether the update after execution of a decision table should be performed on the application server. Perhaps it would also be good to do it using a database procedure.
  • For more complex calculations we should start to think about nested decision tables, but as far as I know this is currently out of scope for HANA decision tables.


Unfortunately, DSM doesn't support these techniques at the moment, so HANA development will be necessary in most real-world scenarios.

 

Summary

So far, DSM's decision tables built on Dynamic Database Views are a little bit limited. Of course they do their job, but together with advanced ABAP view building the implementation would be much easier. On the other hand, the possibility of using arbitrary HANA views in DSM is also an opportunity, since you can access data from non-SAP systems and even consume BW objects using BRFplus, which is useful especially in BW on HANA scenarios. If you need further information about this scenario, you should look here and here.


Another interesting aspect is that the HANA power of DSM is obviously related to the features of the HANA database. So far, decision tables are the only expression that can be pushed down to HANA, but it is obvious that every extension of this mechanism will make DSM much more valuable – think of nested decision tables, for example. Here I expect more to come, which is why I am already looking forward to the SAP TechEd && d-code 2014 sessions about SAP Decision Service Management.

 

So my impression is: DSM has made a good start on its HANA road. Now SAP has improved its infrastructure (especially HRF and CDS), and therefore synergies should be possible. So I think the DSM HANA story will continue, and I am eagerly awaiting further announcements.


And last but not least, I recommend that every ABAP developer keep up with the latest ABAP features (SAP Inside Track Munich this weekend is a perfect chance) and refresh their SQL skills, since the introduction of CDS means a reinvention of the ABAP Dictionary and an extension of the SQL capabilities of the AS ABAP.

 

Appendix

In case you are interested, here are the SQL statements generated from the CDS views shown above. You can use them directly to build HANA views using SQL:

CREATE VIEW "ZMYSCHEMA"."ZVJOINED_POS" ( "MANDT",
        "ITEM",
        "BUYER",
        "BIRTHDATUM",
        "CLASSIFICATION1",
        "CLASSIFICATION2",
        "CLASSIFICATION3" ) AS SELECT
        ITEM."CLIENT" AS "MANDT",
        ITEM."ITEM" AS "ITEM",
        ITEM."BUYER" AS "BUYER",
        BP."BIRTHDT" AS "BIRTHDATUM",
        C1."CLASSIFICATION1" AS "CLASSIFICATION1",
        C2."CLASSIFICATION2" AS "CLASSIFICATION2",
        C3."CLASSIFICATION3" AS "CLASSIFICATION3"
FROM ( ( ( "SAPDAT"."ZITEM" ITEM
      LEFT OUTER JOIN "SAPDAT"."ZVCLSFDDPOS1" C1 ON ITEM."CLIENT" = C1."MANDT"
            AND C1."ITEM" = ITEM."ITEM" )
      LEFT OUTER JOIN "SAPDAT"."ZVCLSFDDPOS2" C2 ON ITEM."CLIENT" = C2."MANDT"
            AND C2."ITEM" = ITEM."ITEM" )
      LEFT OUTER JOIN "SAPDAT"."ZVCLSFDDPOS3" C3 ON ITEM."CLIENT" = C3."MANDT"
            AND C3."ITEM" = ITEM."ITEM" )
      LEFT OUTER JOIN "SAPDAT"."BUT000" BP ON ITEM."CLIENT" = BP."CLIENT"
            AND BP."PARTNER" = ITEM."BUYER" WITH READ ONLY


Please note that ZVCLSFDDPOS1, ZVCLSFDDPOS2 and ZVCLSFDDPOS3 are defined as follows:


CREATE VIEW "SAPDAT"."ZVCLSFDDPOS1" ( "MANDT",
        "ITEM",
        "CLASSIFICATION1" ) AS SELECT
        DISTINCT ZVCLSFD_POS1."MANDT" AS "MANDT",
        ZVCLSFD_POS1."ITEM",
        ZVCLSFD_POS1."CLASSIFICATION1"
FROM "ZVCLSFD_POS1" WITH READ ONLY


CREATE VIEW "SAPDAT"."ZVCLSFD_POS1" ( "MANDT",
        "ITEM",
        "PSTN",
        "CLASSIFICATION1" ) AS SELECT
        POSITION."CLIENT" AS "MANDT",
        POSITION."ITEM" AS "ITEM",
        POSITION."POSTN" AS "PSTN",
        POSITION."CLASSIFICATION" AS "CLASSIFICATION1"
FROM "ZPOSITION" POSITION
INNER JOIN "ZCLASSIFICATIONS" CLASSIFICATION ON POSITION."CLIENT" = CLASSIFICATION."CLIENT"
AND POSITION."CLASSIFICATION" = CLASSIFICATION."CLASSIFICATION1" WITH READ ONLY


Please note that CDS is not restricted to HANA but is available for any database. But don't forget: these examples work in ABAP NW 7.40 SP4 – in later SPs CDS is much more powerful. And for really challenging scenarios I think HANA-optimized SQL is still the best solution.


Usage of BRFplus in BW Transformations – Quick Wins and Challenges


In my last blog I discussed the Dynamic Database View expression in DSM. This expression gives you the possibility to access data from a HANA database, AS ABAP on HANA and especially BW on HANA systems. Please note that every AS ABAP contains a local BW that you can use as an Embedded BW. Using an Embedded BW you can perform operational analytics with the same technologies you may know from the Enterprise BW world. In this blog entry I'll discuss how to use BRFplus in a BW world: Enterprise BW as well as Embedded BW.

 

In BW there are so-called transformations, which can contain ABAP implementations in so-called expert routines or, better, in rule routines or formula BAdIs. A transformation is used in a data transfer process from a DataSource/PSA or InfoProvider to an InfoProvider. The process is described in the SAP Library, and I "steal" the following picture from there:

trafo.JPG

Let me also cite SAP Library:

 

A transformation consists of at least one transformation rule. Various rule types, transformation types, and routine types are available. These allow you to create very simple to highly complex transformations:

  • Transformation rules: Transformation rules map any number of source fields to at least one target field. You can use different rules types for this.
  • Rule type: A rule type is a specific operation that is applied to the relevant fields using a transformation rule. For more information, see Rule Type.
  • Transformation type: The transformation type determines how data is written into the fields of the target. For more information, see Aggregation Type.
  • Rule group: A rule group is a group of transformation rules. Rule groups allow you to combine various rules. For more information, see Rule Group.
  • Routine: You use routines to implement complex transformation rules yourself. Routines are available as a rule type. There are also routine types that you can use to implement additional transformations. For more information, see Routines in the Transformation.

 

When you use BW transformations with ABAP-coded rules, you can get the usual benefits of BRFplus:

  • The business logic is easier to understand compared to ABAP code.
  • BRFplus logic is extensible.

    But one question remains: in which cases can you immediately benefit from using BRFplus? The answer is very simple: usually a transformation gets a data package as input (i.e. an internal table) and returns an internal table with new, calculated values. So if you can loop over the input data and calculate the resulting data line by line, you can directly benefit from BRFplus by calling a function, as sketched below.
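    A minimal sketch of this pattern inside an expert routine. SOURCE_PACKAGE and RESULT_PACKAGE are the routine's standard parameters; the function GUID, the context name SOURCE_LINE and the target field CATEGORY are hypothetical placeholders:

    " Hedged sketch: call a BRFplus function once per line of the data package.
    DATA: lo_function    TYPE REF TO if_fdt_function,
          lo_context     TYPE REF TO if_fdt_context,
          lo_result      TYPE REF TO if_fdt_result,
          lv_function_id TYPE if_fdt_types=>id.   " GUID of the BRFplus function

    lo_function = cl_fdt_factory=>if_fdt_factory~get_instance( )->get_function( lv_function_id ).

    LOOP AT source_package ASSIGNING FIELD-SYMBOL(<ls_source>).
      APPEND INITIAL LINE TO result_package ASSIGNING FIELD-SYMBOL(<ls_result>).
      MOVE-CORRESPONDING <ls_source> TO <ls_result>.

      " Let the rule system derive the value for the current line.
      lo_context = lo_function->get_process_context( ).
      lo_context->set_value( iv_name = 'SOURCE_LINE' ia_value = <ls_source> ).
      lo_function->process( EXPORTING io_context = lo_context
                            IMPORTING eo_result  = lo_result ).
      lo_result->get_value( IMPORTING ea_value = <ls_result>-category ).   " hypothetical target field
    ENDLOOP.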

     

    The situation gets more difficult when you have to enrich the data. For that you usually need data, e.g. from DSOs, that is selected using SELECT … FOR ALL ENTRIES for performance reasons – don't forget we are working with packages of data. Moreover, in many use cases we need table and loop operations to calculate the results.

     

    In this case we have the problem that the DB Lookup expression of BRFplus doesn't support the FOR ALL ENTRIES statement. Another, less painful drawback is that DB Lookup doesn't know DSO objects, but you can usually use the method cl_rsd_odso=>get_tablnm to get the name of the corresponding transparent table, as sketched below. Fortunately, BRFplus can show the texts for the DSOs' generated transparent tables.
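    A rough sketch of the pre-fetch on the ABAP side, assuming lv_tabname already holds the DSO's active table name (determined e.g. via cl_rsd_odso=>get_tablnm); the structures and field names are placeholders:

    " Hedged sketch: pre-fetch lookup data for the whole package with FOR ALL ENTRIES.
    TYPES: BEGIN OF ty_key,
             doc_number TYPE c LENGTH 10,
           END OF ty_key,
           BEGIN OF ty_lookup,
             doc_number TYPE c LENGTH 10,
             attribute  TYPE c LENGTH 10,
           END OF ty_lookup.

    DATA: lt_keys    TYPE STANDARD TABLE OF ty_key,
          lt_lookup  TYPE STANDARD TABLE OF ty_lookup,
          lv_tabname TYPE tabname.

    " FOR ALL ENTRIES must never be executed with an empty driver table.
    IF lt_keys IS NOT INITIAL.
      SELECT doc_number attribute
        FROM (lv_tabname)
        INTO TABLE lt_lookup
        FOR ALL ENTRIES IN lt_keys
        WHERE doc_number = lt_keys-doc_number.
    ENDIF.

    The pre-fetched table can then be handed over to the rule processing, e.g. as a table-type context parameter, so that no FOR ALL ENTRIES is needed inside BRFplus.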

     

    Please also be aware that BRFplus code is fast, but usually not as fast as hand-coded ABAP. The reason is that ABAP offers many features for internal tables, like secondary indexes, and with 7.40 SP8 more is to come. But I don't consider this a reason against BRFplus – it only matters in time-critical and high-volume scenarios. I am well aware of the fact that developers love to theorize about performance. When I was a student I was a developer at a mathematical research institute, and there were discussions about C vs. C++. Today I am tired of such discussions and only want to cite Donald Knuth: "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil". Of course there are use cases that must be fast, but business logic must be comprehensible and flexible first.

     

    To summarize my point of view: you can use BRFplus in BW scenarios. In later discussions on Twitter I was asked to also mention the drawbacks of this approach:

    I can only emphasize this: you should use this approach only in scenarios where it is necessary and makes sense. Reckless usage of ABAP code in BW artifacts is always critical.

    The only challenge when using BRFplus are use cases where you read data from other DSOs; here the missing FOR ALL ENTRIES support in BRFplus is painful. Of course you can call ABAP code from BRFplus, but in my opinion this reduces the value of a rule system.

     

    A new DB Lookup Expression would be useful

    There are many ways BRFplus and BW can interact. For using BRFplus "inside" BW there are many scenarios, and an enhanced DB Lookup expression would be very useful for them. Since the database is moving more and more into the focus of ABAP programming, and Open SQL is being extended in NW 7.40, it is time to introduce a new DB Lookup expression especially for this release. This would not only help in BW scenarios but would also upvalue BRFplus in general.

    Handling of Applications with the BRF+ API (Part 1)


    Introduction

    Many questions and descriptions here in SCN focus on the usage of BRF+ via the workbench as the interface for the creation and maintenance of BRF+ objects. But there is also another way to get these things done, namely the BRF+ API. This blog shall introduce you to some basic usage scenarios of the API. A comprehensive description of the API can be found in the book BRFplus – The Book by Carsten Ziegler and Thomas Albrecht.

     

    The first question that might arise at this point is: why should I use an API if I have the comfort of the BRF+ workbench with its quite usable Web Dynpro UI?

    The answer is: there are scenarios where the BRF+ workbench cannot be used for maintenance. One example is the maintenance of rules in the productive environment when DSM is not in place. If you are using BRF+ applications of storage type S or C, your system settings will usually prevent their maintenance in the productive system. There is a workaround using an application exit, but then you really have to take care of the synchronization process between the development and the productive system, as those applications are still part of the regular transport process. This can get quite tricky and is not the best way to go.

    Now you might say that a master data application could solve the issue, but that one is not intended to be transported, except via the XML export and import functionality, which might not be the preferred way of your IT operations team. So this might be the right time to switch to a generative approach using the BRF+ API. This approach might also be preferable when you want to (deeply) integrate a BRF+ artifact (e.g. a decision table) into an existing application, so that the user is not even aware of using BRF+ while still being able to use all the nice features of the artifact.

     

    Before the story gets a little bit too theoretical I want to introduce a business scenario where this approach can be used.

     

    The Business Scenario

    As my background is in the field of insurance, so is my scenario: let us assume we have an SAP module (like SAP Policy Management) that allows you to manage master policies, more specifically fleet master policies. Within these master policies the insurer and the insured company agree on the insurance of the company cars. Usually the insured company pays a certain premium per car, and the basis for the premium calculation is written down in the master policy. This basis depends on certain properties of the cars to be insured. Let us use the following very simplified example for such a table:

     

    Weight of car in tons | Performance of the car in kW | Premium amount in EUR
    <= 1.5                | <= 80                        | 250
    <= 2.5                | <= 90                        | 350
    <= 3.5                | <= 110                       | 400
    > 3.5                 | > 110                        | 500

     

    Each insured car finally gets one single policy issued, with the premium corresponding to the properties of the car and the agreement in the fleet master policy, e.g. for a car with up to 1.5 tons and a performance of up to 80 kW you pay a premium of 250 €. Even this simplified description leads us to the point that a decision table might be a very good choice to tackle the scenario. But here come the problems:

    • The premium conditions are entered on the productive environment after the negotiation between the insurer and the insured company has been concluded.
    • No Decision Service Management is available.
    • The user does not want to switch from the master policy management application to the BRF+ application in order to enter the data of the master policy. He wants to have one single transaction.

     

    So a possible solution for this scenario consists of two tasks:

    • Enhance the UI of the policy management application to allow the user to display and edit the data in the decision table. For this topic I want to refer to the corresponding document available here in SCN: http://scn.sap.com/docs/DOC-4578
    • Implement functionality in the system that creates a decision table (including its "surroundings" like application, function etc.) in the background for each master policy and makes it available to the UI. This task also includes creating the functionality to maintain the data in the decision table (plus nice features like the check for overlaps). This is the part that the blog focuses on in the subsequent sections.

     

    Disclaimer

    Before starting with the technical details, just a short disclaimer: the usage of the API for the maintenance of BRF+ objects is quite an advanced topic, so if you decide to use it you should know what you are doing (which is in most cases a quite good approach). Especially when you are integrating the functionality into an existing application, you have to be careful, e.g. about the commit handling, the transaction handling etc. of the application and the BRF+ API and the interplay between them. Intensive testing of the integrated components also plays an important role.

    In addition, the coding snippets shown below do not have sophisticated error handling, so for real usage some adaptations have to be made.

     

    But now enough of that part, let us start with the real stuff ... coding with the BRF+ API!

     

    Solution

    In order to make the decision table available for the enhanced UI of the master policy management, we will implement a handler class ZCL_BRFPLUSAPI_FLEET for the single operations that have to be triggered via the BRF+ API. As a rough guide, this means we have the following methods for the:

    • creation of the BRF+ application with a function (in functional mode) and a decision table assigned to the function
    • reading of the decision table data
    • insertion and update of data in the decision table
    • deletion of the complete application
    • service functionalities like the check for gaps or overlaps in the decision table
    • processing of the decision table

     

    As the structure of the decision table is constant, there is no need to create the single data objects for the columns whenever a new decision table is created. So, as a prerequisite, we have an additional BRF+ application of storage type S that holds the corresponding data objects for the weight, the performance and the premium amount as reusable data objects.

     

    Creating the decision table

    In order to create the decision table we first have to create a BRF+ application. We will use an application of type master data for the example, as this is the obvious storage type. For this task we design one method that creates the application including its sub-objects and returns a reference to it:

    Bild1.jpg

    Within that method we first fetch a reference to the BRF+ factory which is the central point when creating BRF+ objects:

    Bild2.jpg

     

    As already shown we then delegate to a method that creates the application with its basic features and returns the reference to it. The method contains the following logic (the numbers refer to the highlighted sections in the following screenshot):

    1. We call GET_APPLICATION on the BRF+ factory and get back a reference to an empty application object. Next we enqueue the application object itself by using the corresponding method of the interface IF_FDT_TRANSACTION.
    2. After that we set the basic parameters like the application component and so on, using the methods that are available on the application object. For every field that you can see in the BRF+ workbench, one method is available in the application API.
    3. After that we call a specific API method that creates the master data application based on the properties we set on the object. There are also methods for the other types of applications, like customizing applications.
    4. As we have now finished the application-specific actions, we activate the application via the method IF_FDT_TRANSACTION~ACTIVATE. Depending on success or failure, we save and dequeue the application or just dequeue it. In case of success the application is now available in the BRF+ workbench.

    Bild3.jpg

     

    Some remarks on this first task:

    • The API also allows a deep activation and saving of objects (meaning that you trigger the saving e.g. on application level and all dependent sub-objects are saved too). This alternative will be shown in the following screenshots.
    • The name of the application was determined using the class CL_FDT_SERVICES. This class offers methods that create GUIDs for your names, either with or without a specific namespace, and was called in the method GET_UNIQUE_NAME in the screenshot above.
    • Most of the BRF+ API methods throw the exception CX_FDT_INPUT. You should always catch and handle it.
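    To give an impression of the bracketing of these calls, here is a minimal sketch of the create/activate/save lifecycle. The property setters and the dedicated method that turns the object into a master data application are omitted (each of them is a separate method of the application API), so this is only an illustration of the pattern, not the productive coding:

    " Hedged sketch of the object lifecycle via the BRF+ API.
    DATA: lo_factory     TYPE REF TO if_fdt_factory,
          lo_application TYPE REF TO if_fdt_application.

    lo_factory = cl_fdt_factory=>if_fdt_factory~get_instance( ).

    TRY.
        " 1. Get an empty application object and lock it.
        lo_application = lo_factory->get_application( ).
        lo_application->if_fdt_transaction~enqueue( ).

        " 2. Set name, application component etc. via the setter methods of the
        "    application API, and create the master data application (omitted here).

        " 3. + 4. Activate, then save and unlock. In real code you would check the
        "    activation result before saving.
        lo_application->if_fdt_transaction~activate( ).
        lo_application->if_fdt_transaction~save( ).
        lo_application->if_fdt_transaction~dequeue( ).
      CATCH cx_fdt_input INTO DATA(lx_fdt).
        " Error handling goes here; the messages are available via lx_fdt.
    ENDTRY.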

     

    Now let us go back to the main method for the creation operation. Next we want to create the decision table, so that it is in place when we create the function and assign it as the top-level expression. But before we can do so, we need access to the data objects used in the decision table, i.e. their references, which we have stored in the S application for reuse:

    Bild4.jpg

    Within this method we once again use the BRF+ factory to fetch the references to these objects:

    Bild5.jpg

    If you know the IDs of the data elements (which I stored as constants in an interface), the reference is just one call of GET_DATA_OBJECT away. With these references we have access to all the attributes of the data objects, e.g. their type.

     

    With the BRF+ factory, the application object and the references to the data objects that make up the columns of the decision table, we can now go on and create the decision table via another private method:

    Bild7.jpg

    The creation of a decision table is, roughly speaking, the same as the creation of an application, with the following steps (the numbers refer to the highlighted sections in the following screenshot):

    1. Create a decision table object via the factory, enqueue it and set the basic properties. Each attribute that can be set is represented by a method of the decision table API.
    2. Fill the decision table columns and set their properties (e.g. whether a column is a condition or a result column).
    3. Set the corresponding objects on the decision table object.
    4. In addition, we also store the object IDs of the context objects, which will be used in the next task when creating the function.

     

    Bild8.jpg

     

    As you can see we did not yet activate and save the object.

    Now we have the references to the application object, the (empty) decision table object and the data objects. We can therefore finish the creation operation by creating a function and activating the function and its sub-objects:

    Bild9.jpg

    This method encapsulates the following steps (the numbers refer to the highlighted sections in the following screenshot):

    1. First, as usual, we create a reference to a function object via the BRF+ factory. After enqueuing it, we assign it to an application and use the methods of the object to set the basic attributes of a function object, like the name and versioning. Last but not least we set the mode of the function to "functional mode". As with every BRF+ API, the object reference has methods for each attribute that can be set.
    2. Next we assign the context and result data objects as well as the expression to the function. Remember that we fetched the object IDs of the context parameters when we constructed the condition columns of our decision table.
    3. As the third step we activate the function. This time we set the optional parameter IV_DEEP to ABAP_TRUE, which leads to an activation of the function and its inactive sub-objects (in our case the decision table).
    4. Last but not least, we save and dequeue the function in case of a successful activation. Once again we use the IV_DEEP parameter of the methods to propagate the action.

    Bild10.jpg

    That's it – executing the create method CREATE_MASTERDATA_APPLICATION of the handler class ZCL_BRFPLUSAPI_FLEET will now create an application with a function and an empty decision table from scratch. The method returns the reference to the application object, so we can, for example, store the GUID of the application (available in the public attribute MV_ID of the application reference) in our master policy in order to be able to retrieve the application and its sub-objects at a later point in time.

     

    The key take-aways of this first section are:

    • Every BRF+ object has an API that can be used to create and maintain it.
    • In order to get an object reference to one of the APIs we need the BRF+ factory class CL_FDT_FACTORY.
    • Some global BRF+ service functionalities are implemented in the class CL_FDT_SERVICES

     

    Setting and Getting the Data

    After the successful creation of the decision table, we want to offer methods in our handler class to read the entries of the decision table as well as to set data into it.

     

    As a starting point let us assume that we have the GUID of the application and have to retrieve the reference to the decision table via that GUID. So we implement a method that fetches the references to the objects that belong to the application specified by the GUID:

    Bild11.jpg

    The content of the method consists of two major parts, namely fetching the function object and then fetching the decision table object.

    For the function object the following steps are performed (the numbers refer to the highlighted sections in the following screenshot):

    1. First we get a reference to the query object of BRF+, specific for functions. In the BRF+ API, a query object exists for each artifact.
    2. Then we construct the selection parameters, in our case the parameters of the application identified by its ID, and specify that we are looking for a master data object.
    3. After that we fire the query and get the ID of the function as a result.
    4. As a last step we use another important method of the BRF+ factory, namely GET_INSTANCE_GENERIC, which returns a reference to the function object that we then cast to the function-specific interface.

    Bild12.jpg

    As we have a 1:1 relationship between the function and the decision table, we can directly fetch the reference to the decision table by using the method GET_EXPRESSION of the function API, which returns the ID of the decision table object:

    Bild13.jpg
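    As a rough sketch of this retrieval step – note that the exact parameter names of GET_INSTANCE_GENERIC and GET_EXPRESSION are assumptions here and may differ slightly in your release, so check the factory API in your system:

    " Hedged sketch: fetch the top-level expression of the function and cast it
    " to the decision-table-specific interface.
    DATA: lo_function       TYPE REF TO if_fdt_function,
          lo_decision_table TYPE REF TO if_fdt_decision_table,
          lv_expression_id  TYPE if_fdt_types=>id.

    lv_expression_id = lo_function->get_expression( ).

    " Generic instantiation via the factory (parameter name iv_id is an assumption).
    lo_decision_table ?= cl_fdt_factory=>get_instance_generic( iv_id = lv_expression_id ).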

    After casting to the right interface, we have the reference to the decision table that we can use for getting and setting the data. So we enhance the handler class ZCL_BRFPLUSAPI_FLEET with two additional methods for getting and setting data from/into the decision table object:

    Bild14.jpg

    The implementation of the GET method is straightforward as we call the corresponding GET method of the decision table API:

    Bild15.jpg

    The SET method is nearly as straightforward as the GET method, so we call the SET method of the API; but as we make a change to the table, we have to enqueue/dequeue as well as activate and save the object, in analogy to its initial creation:

    Bild16.jpg

    As you can see the setting and getting of the data itself is not difficult but it is certainly also of interest what the table data has to look like in order to be able to call these two methods.

    The following coding snippet gives you some insight how this can be achieved:

    First we create a reference to our handler class and fetch the reference to the decision table object (the application ID is a parameter of the report).

    Second we fetch the type names of the individual columns using the BRF+ factory, as shown in the screenshots below:

    Bild17.jpg

    Then we fill the table parameters with the values for the condition columns, where we specify the table column and row, the operators for the comparison, and the concrete value:

    Bild18.jpg

    After that we fill the result column, which is more straightforward due to the direct assignment of the result value without comparison options:

    Bild19.jpg

    In this way we fill the table line by line and finally write it into our decision table using the handler class:

    Bild20.jpg

    The key take-aways of this second section are:

    • Having the ID of a BRF+ object enables you to search for other BRF+ objects using the query functionality (IF_FDT_QUERY) available via the BRF+ factory.
    • You can either instantiate a BRF+ object via the corresponding method for the object on the factory or you use the static generic method GET_INSTANCE_GENERIC of the factory.
    • Every API has methods to set and get the data and properties stored in the object. The concrete parameters depend on the specific object type.

     

    Summary and Outlook

    Within this blog a scenario was presented as an example of how to use the BRF+ API to create and maintain BRF+ objects. We created an application, a function and an empty decision table in storage type "master data". Then we looked at how to fetch references to the created objects via the ID of the BRF+ application and how to set/get data to/from the decision table object.


    As the BRF+ API offers several more functionalities that might be of interest and that are listed in the function list of the handler above, a further blog will be available soon that describes e.g. the API to check for gaps and overlaps in the decision table.

    Handling of Applications with the BRF+ API (Part 2)


    Introduction

    In the first part of this blog, Handling of Applications with the BRF+ API (Part 1), we took the first steps with the BRF+ API. They consisted of:

    • Creating an application of storage type "master data" with a function and an expression. The function acts in functional mode and contains a decision table as its top-level expression. For the context and result data objects we used existing data objects of a reusable S application.
    • Learning how to get the function and the decision table using the ID of the application.
    • Finding out how to get and set data from/into a decision table.

     

    So what are the topics of this second part? Within the next sections we will take a look at how the BRF+ API can support us when we want to:

    • Check a decision table for completeness
    • Check for overlaps in a decision table
    • Export and import the data of the decision table to/from Excel format
    • Process the function
    • Enforce code generation
    • Delete the application and its sub-objects

     

    So let us do some coding again!

     

    Checks on Decision Tables

    One very useful feature of decision tables is the check functionality you can execute on them, namely the check for completeness and the check for overlaps. Thinking about our scenario with the integration of the decision table into a completely different application, we certainly do not want to withhold this functionality from the end-user, as it adds so much additional value. So we add two new methods to the handler class ZCL_BRFPLUSAPI_FLEET that look like this:

    Bild1.jpg

    Within these methods we call the corresponding functionality of the BRF+ API of the decision table. In both cases this functionality is provided by methods of the interface IF_FDT_DECISION_TABLE_SERVICE.

     

    First let us take a look at the check for overlaps and what has to be done in order to trigger it:

    Bild2.jpg

    As you can see, it is really easy to trigger the check:

    1. First we get an instance of the decision table object we created (how this can be done via the application ID is described in part 1 of the blog) and do some casting.
    2. Then we call the CHECK_OVERLAP method of the service interface, which does the job for us and returns the corresponding messages in case overlapping conditions exist (see the sketch after these steps).
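     

    As a rough orientation, a minimal sketch of these two steps could look as follows; the exporting parameter name of CHECK_OVERLAP and the message table type are assumptions, so verify them against IF_FDT_DECISION_TABLE_SERVICE in your system before reusing the snippet.

    DATA: lv_decision_table_id TYPE if_fdt_types=>id,   " determined as in part 1 of the blog
          lo_instance          TYPE REF TO if_fdt_admin_data,
          lo_dt_service        TYPE REF TO if_fdt_decision_table_service,
          lt_message           TYPE if_fdt_types=>t_message.

    " Step 1: instantiate the decision table generically and cast to the service interface.
    cl_fdt_factory=>get_instance_generic( EXPORTING iv_id       = lv_decision_table_id
                                          IMPORTING eo_instance = lo_instance ).
    lo_dt_service ?= lo_instance.

    " Step 2: trigger the overlap check; messages are returned for overlapping conditions
    " (the parameter name ET_MESSAGE is an assumption).
    lo_dt_service->check_overlap( IMPORTING et_message = lt_message ).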

     

    The check for completeness is nearly as straightforward, but here we have the option to do it in two ways:

    • We check for completeness and return the information that there are gaps, which can be achieved by the method IF_FDT_DECISION_TABLE_SERVICE~CHECK_GAP.
    • We check for completeness and return the information on what has to be done in order to fill the gaps in the table.

     

    As the second option is more convenient for the user we use that one in our implementation:

    Bild3.jpg

    So the following steps have to be taken:

    1. As in the other method we first fetch the decision table object via the BRF+ factory and do some casting.
    2. Then we call the method FILL_GAP_ROWS of the interface IF_FDT_DECISION_TABLE_SERVICE to get a proposal on how to fill the gaps. This method internally calls the CHECK_GAP method.
    3. Finally we fetch the result of the gap filling in textual form by calling the method GET_TABLE_DATA_TEXTS (see the sketch after these steps).
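     

    Sketched in code, the core of the method is just the two calls below; the exporting parameter names are placeholders chosen for illustration only, so verify them against IF_FDT_DECISION_TABLE_SERVICE in your system.

    " lo_dt_service is the decision table cast to IF_FDT_DECISION_TABLE_SERVICE,
    " obtained exactly as in the overlap check above.

    " Let the API propose rows that close the gaps (CHECK_GAP is called internally).
    " Exporting parameter names are placeholders, not the verified signature.
    lo_dt_service->fill_gap_rows( IMPORTING ets_data = DATA(lts_gap_rows) ).

    " Retrieve the proposal in textual, human-readable form.
    lo_dt_service->get_table_data_texts( IMPORTING ett_texts = DATA(ltt_gap_texts) ).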

     

    To conclude, we are now able to run these checks on a decision table without using the BRF+ workbench and can, hence, integrate them into other UIs.

     

    Export/Import Functionality for Decision Tables

    Another functionality that is very handy when it comes to decision tables is the possibility to maintain them in Excel. As the concrete Excel call depends on the UI technology into which you want to embed the coding, no step-by-step example is given here, but you will see the core methods to fetch the information of the decision tables in a compliant format. The code snippets are given below:

    Bild4.jpg

    1. The central class for the Excel transformation is the class CL_FDT_DT_EXCEL, so you first have to create an object of it. Within that class you have basically two methods to handle the Excel tasks (see the sketch after this list).
    2. The method CREATE_EXCEL_FROM_DECTAB creates an XSTRING that contains the decision table data.
    3. The method MODIFY_DECTAB_FROM_EXCEL is the counterpart to the prior one and modifies the decision table using the input from the Excel file, i.e. the XSTRING.
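     

    In very condensed form the round trip could look like the sketch below. It assumes a parameterless constructor and guesses the parameter names, so treat it purely as an outline and check the class CL_FDT_DT_EXCEL in your system.

    DATA: lo_excel   TYPE REF TO cl_fdt_dt_excel,
          lv_xstring TYPE xstring.
    " lv_decision_table_id holds the ID of the decision table, determined as in part 1.

    CREATE OBJECT lo_excel.   " assuming a parameterless constructor

    " Export: build the Excel representation of the decision table as an XSTRING
    " (parameter names are assumptions).
    lo_excel->create_excel_from_dectab( EXPORTING iv_dt_id = lv_decision_table_id
                                        IMPORTING ev_xdata = lv_xstring ).

    " Import: write a (modified) Excel file back into the decision table.
    lo_excel->modify_dectab_from_excel( EXPORTING iv_dt_id = lv_decision_table_id
                                                  iv_xdata = lv_xstring ).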

     

    If you want to see how the Excel call is integrated for Web Dynpro ABAP, take a look at the component FDT_WD_DECISION_TABLE in the package SFDT_WD_EXPRESSIONS, where the methods shown above are used for the Excel integration in the BRF+ workbench.

     

    Process the Function

    The coding for the processing of the function can be retrieved as usual via the code template in BRF+. So you should create one “dummy” function for the scenario used here and then create the template using the BRF+ workbench, or alternatively start the report FDT_TEMPLATE_FUNCTION_PROCESS. After retrieving the coding you can delete the dummy function again and implement the call in the backend based on the sample coding. The first thing to consider is that the ID of the function is not fixed (as it is in the coding template) but has to be handed into the method of the handler as an importing parameter, as shown in the following screenshot:

    PROCESS_FUNC_1.jpg
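     

    For readers without system access, here is a minimal sketch of such a wrapper method. It follows the structure of the generated template; the context name 'CAR_TYPE' and the method parameters IV_CAR_TYPE/EV_RESULT are purely illustrative and not taken from the original coding.

    METHOD process_function.
      " Assumed signature: IV_FUNCTION_ID TYPE IF_FDT_TYPES=>ID plus the illustrative
      " importing parameter IV_CAR_TYPE and exporting parameter EV_RESULT.
      DATA: lo_function TYPE REF TO if_fdt_function,
            lo_context  TYPE REF TO if_fdt_context,
            lo_result   TYPE REF TO if_fdt_result.

      " The function ID is handed in, so the coding is not bound to one fixed function.
      lo_function = cl_fdt_factory=>if_fdt_factory~get_instance( )->get_function( iv_id = iv_function_id ).

      " Fill the context; the data object names depend on your concrete function.
      lo_context = lo_function->get_process_context( ).
      lo_context->set_value( iv_name  = 'CAR_TYPE'    " illustrative name only
                             ia_value = iv_car_type ).

      " Process the function and read the result.
      lo_function->process( EXPORTING io_context = lo_context
                            IMPORTING eo_result  = lo_result ).
      lo_result->get_value( IMPORTING ea_value = ev_result ).
    ENDMETHOD.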

     

    The second point to think about is the handling of traces. Whether you need or want traces in your scenario depends on your specific requirements, so no sample solution can be given here. However you decide, the code template can be generated in a manner that supports traces (more information on traces can be found e.g. in the document Tracing in SAP Decision Service Management).


    No further tasks are necessary to expose that functionality, unless you want to rewrite the proposed code template in ABAP 7.40 style:

    PROCESS_FUNC_2.jpg

    But this is indeed an optional task.

     

    Enforce Code Generation

    As additional functionality we want to implement a method that enforces the code generation of the BRF+ function. Now you might ask why one should do that, as code generation is automatically triggered when the function is called for the first time (the very first execution of the function runs in interpretation mode). This can cause issues when the BRF+ function is called for the first time from parallel processes, which might be the case in our scenario. So to be on the safe side and to have optimal performance of the function call right from the start, we add the following method to the BRF+ API handler class:

    Generate Source 1.jpg

     

    The method contains a call of the function module FDT_CC_GENERATE_FUNCTION which triggers the code generation:

    Generate Source 2.jpg
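     

    Reduced to its essence, the method boils down to something like the following; the parameter name passed to the function module is an assumption, so check the actual interface of FDT_CC_GENERATE_FUNCTION before copying it.

    " Trigger the code generation for the BRF+ function explicitly
    " (the parameter name IV_FUNCTION_ID is an assumption).
    CALL FUNCTION 'FDT_CC_GENERATE_FUNCTION'
      EXPORTING
        iv_function_id = iv_function_id.

    " The explicit commit finalizes the generation; see the remark below on where to
    " place it when integrating into the master policy application.
    COMMIT WORK.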

     

    Please be aware that the method shown here contains an explicit COMMIT WORK that might have to be moved elsewhere when integrating it into the master policy application, in order not to interfere with the commit handling implemented there.

     

    Deletion of Applications

    Up to now we have implemented the creation of the BRF+ objects, the update of the data in the BRF+ objects and some read functionality (plus service functionality like the check for overlaps). So to complete the CRUD paradigm we add a delete method to our BRF+ API handler class:

    DEL_APP_1.jpg

     

    The deletion functionality is already encapsulated in the class CL_FDT_DELETE_HANDLING. In analogy to the process in the BRF+ workbench the deletion of the objects via the BRF+ APIs consists of three steps:

    1. First we mark the complete application with its sub-objects for deletion by calling the method MARK_FOR_DELETE_VIA_JOB, as shown in the green colored area. A prerequisite is to fill the importing parameter for the selection appropriately, as shown in the red colored area: DEL_APP_2b.jpg
    2. Then we logically delete the objects using the method DELETE_LOGICAL_VIA_JOB: DEL_APP_3.jpg
    3. The last step is the physical deletion of the objects, which we trigger via the method DELETE_PHYSICAL_VIA_JOB: DEL_APP_4.jpg

    As usual the error handling has to be adapted to your specific needs.

     

    The parameters handed over in the three calls of the class CL_FDT_DELETE_HANDLING are restricted to master data applications (as you can see in the very first screenshot when looking at the parameter LS_OBJECT_CATEGORY_SEL). So in case you use the API for the other storage types, the concrete values have to be adapted.

     

    One further hint for when you play around with this functionality, as I came across it several times while testing (it was some kind of Homer Simpson like experience … d'oh): the methods above check whether there is a lock on the objects that are to be deleted. So if you execute the coding while having the object open in the workbench in edit mode (which is the default if you personalized your workbench as an expert), the methods will return an error stating that the objects are locked. Do not ask how often that happened to me when I played around with the parameters of the methods to check whether the objects were really deleted.

     

    Additional Information

    As already mentioned, this blog is intended to introduce you to the possibilities of the BRF+ API and is far from describing all the options the API offers. So if you want to take a deeper dive into the topic, a very good description is given in the book BRFplus – The Book by Carsten Ziegler and Thomas Albrecht.

     

    There is also plenty of information accessible directly in your system:

    • Many demo reports that show the usage of the API in different scenarios are located in the package SFDT_DEMO_OBJECTS: SFDT_DEMO_OBJECTS.jpg
    • Another source of knowledge is available via the reports that are grouped in the transaction FDT_HELPERS. Here you will certainly find useful coding snippets for your work with the BRF+ API:

    FDT_HELPERS.jpg

     

    Conclusion

    Concluding this two-part blog, I hope the information described in it gave you some insight into the BRF+ API and supports you in having a smooth start with the generation of BRF+ objects.

     

    As soon as you start to work with the API you will experience that its design is very well elaborated: after getting used to it you will find the methods you need where you would expect them, i.e. everything is very intuitive, which is unfortunately not the case for all APIs out there. Many thanks to the BRF+ team for that good job!


    P.S. You certainly have become aware that the coding was done in Eclipse using the ABAP Development Tools. Having worked with these tools for some time now, I strongly recommend using them, as they will speed up your work. So I would additionally like to state (although this is not part of the ADT area on SCN) that the team around Thomas Fiedler did a great job on those tools (and hopefully will continue to do so, but I have no doubt about that).


    So the BRF+ API together with ABAP in Eclipse is indeed developing like never before.

    What was your Recipe for Success for Learning BRFplus?


    In the last months I have spent much time helping people in the BRM forum on SCN. I realized how many developers started with BRFplus and quickly had success. At a certain point typical problems occur: how do you transport rules? How do you analyze transport requests? How do you trace the results? Obviously DSM can help in these cases. Nevertheless, some people master BRFplus within a very short time while others need more time. What is the reason for this?

     

     

    How did you learn BRFplus? What were the difficulties? How did you overcome them?

     

     

    Before I present my personal point of view I would like to hear your story:

    • How did you start with BRFplus/DSM?
    • What was helpful for you?
    • What was misleading?
    • What was your breakthrough to success?

     

    I would really appreciate it if you shared your experience in the comment section of this blog or started blogging about this topic.

     

    My Personal Story

     

     

    At first I was overwhelmed by the complexity of the framework and had many questions: which storage type do I have to choose? How do I structure my BRFplus application? Which expression types should I choose?

     

     

    Then I asked myself whether I should think of BRFplus as a framework or a library. This is in fact a huge difference: most frameworks define rigid boundaries and you need strict development guidelines to use them properly. The reason is that frameworks rely on the principle of “inversion of control” – they call your code, which is in fact a kind of plugin. And indeed BRFplus has some properties of a framework, since it has an entry point (a function) and is modularized using rule sets.

     

     

    But then I realized that I can use BRFplus as a library that defines a language consisting of rules, formulas, Boolean expressions and decision tables. I then used those expressions to formulate the business logic, and I tried to do so in a way that business experts could understand.

     

     

    From this moment on it became very simple: I sensed a feeling of success. I started to refactor my rules and learned how to use the UI efficiently. Later I learned more expression types, learned about rule logistics and extensibility, and experts like Carsten Ziegler and Wolfgang Schaper gave me valuable advice. In fact they had already written down most of their knowledge here, so I read their whitepapers. So let me conclude: I succeeded when I stopped thinking of BRFplus as a framework and thought of it as a library.

     

     

    Danger of Frameworks

     

     

    Please don’t get me wrong – I appreciate frameworks, and SAP has invented some very powerful frameworks, like BOPF for example.

     

    But frameworks have their dangers and problems: they are difficult, and it takes some time to master them. Once they are mastered, developers become highly specialized, and I have learned that many team and development leads don’t encourage their developers to learn about new frameworks and libraries and don’t promote their developers learning something new.

     

     

    BRFplus is a Toolbox – not a Framework – and you have to use it that way

     

     

    From the point of view of a software architect, BRFplus is a sophisticated framework, and I encourage everyone who is interested in good ABAP development to study its internal structure and design patterns. You also have to understand these patterns when you extend BRFplus. But you shouldn’t think of it as a framework when you are creating rule systems and decision services. As I already mentioned, I recommend thinking of it as a toolbox, language or library.

     

     

    Unfortunately many ABAP developers know more frameworks than libraries, and most of the time they work within the borders of frameworks like BDT, enhancements of SAP standard software and so on. But this has to change in the future, since in my opinion the time of creating frameworks at SAP is over. In fact SAP has many really impressive and powerful frameworks, but more and more libraries are being developed at the moment. The reason is simple: libraries are more flexible and allow people to create their own individual UIs and processes, and this is the direction SAP is going right now.

     

     

    Maybe you have a different opinion about this, and I would like to discuss that as well, but it is my belief that this mindset will help you become successful with BRFplus.

    November 2014: Reader’s Digest for SAP Decision Service Management (DSM) and Business Rule Framework plus (BRFplus)


    Here is a new edition of my digest blog series providing an overview of the content I found when dealing with SAP Decision Service Management (DSM) and Business Rule Framework plus (BRFplus).

    Product Modeling for the Utilities Industry

    DSM/BRFplus is tightly integrated into the Utilities industry solution. Several sources of information exist:

    Misconceptions about SAP Decision Service Management vs. BRFplus

    This is a very important blog as it contains a list of questions that are often asked about DSM and BRFplus. And of course the blog also gives the answers.
    Thanks Wolfgang for putting this together.

     

    SAP Decision Service Management or How to Get Rid of Custom Code

    This is the video of a talk by Dr. Christian Lechner. He introduces DSM and he explains how it can be used to reduce or eliminate custom code. In 2013 I wrote a blog about this topic, How to Kill Custom Code and Z-Tables. Dr. Christian Lechner was obviously inspired by this blog. However, in his talk he dives much deeper into the topic than I did in my blog.

     

    Handling of Applications with the BRF+ API

    DSM/BRFplus superhero Dr. Christian Lechner showed his superhero skills in a little blog series on how to use the BRFplus API. It’s very well explained and very instructive!
    Handling of Applications with the BRF+ API (Part 1)
    Handling of Applications with the BRF+ API (Part 2)

     

    Tracing in SAP Decision Service Management

    Over the last 12 months we have improved the tracing capabilities of DSM/BRFplus significantly. Unfortunately, roll-out of the information fell behind. This gap has now been closed with a new document.

     

    SAP Decision Service Management Test Case Tool

    The DSM test case tool was finally released in summer 2014. Thanks to Wolfgang Schaper we have a document that explains its capabilities.

     

    Usage of BRFplus in BW Transformations – Quick Wins and Challenges

    Once again a very informative and helpful contribution by SAP mentor Tobias Trapp.

     

    A Quick Look at DSM HANA Expressions using Dynamic Database View

    And another blog from Tobias Trapp. This time he pioneered writing about the Dynamic Database View feature in DSM. Some very hard work is being done on HANA capabilities. At TechEd in Las Vegas and Berlin we already gave a preview of the next level that is much more powerful than what Tobias describes here.

     

    What Was Your Recipe for Success for Learning BRFplus?

    This is a very interesting blog by Tobias Trapp asking the readers to comment on how they learned to use BRFplus. I hope there are many responses to his blog. I know that there are 2-3 sources that lead to misunderstandings and problems. Let’s see if this is confirmed by the commenters. Over the last few years I used training courses to observe users of DSM/BRFplus to work out usability improvements with the development teams.
