Channel: SCN : Blog List - ABAP Development

The case of cl_http_utility=>decode_x_base64


Hi,

 

Lately there was a thread about converting Base64 to PDF.

 

I was intrigued, so I decided to run some tests (we might receive any kind of data this way).

 

Using Java I generated a Base64 text file (this will be used as input).

screenshot_01.png

This is what the output file looks like (a 1,060,588-byte string):

screenshot_02.png

Time for some ABAP code.

 

Program steps (Y_R_EITAN_TEST_31_10):

 

- Use cl_gui_frontend_services=>gui_upload to upload the text file.

- Use cl_http_utility=>if_http_utility~decode_x_base64 to decode the data; the output of this method is an xstring.

- Use cl_gui_frontend_services=>gui_download to write the xstring as a PDF.
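The three steps can be sketched roughly like this (a sketch only: the file paths and the cl_bcs_convert helper are my assumptions, not taken from the original program):

```abap
" Sketch of the program flow; file names are illustrative.
DATA: lt_text   TYPE STANDARD TABLE OF string,
      lv_base64 TYPE string,
      lv_xpdf   TYPE xstring.

" 1) Upload the Base64 text file from the frontend.
cl_gui_frontend_services=>gui_upload(
  EXPORTING filename = 'C:\temp\input_base64.txt'
  CHANGING  data_tab = lt_text ).
CONCATENATE LINES OF lt_text INTO lv_base64.

" 2) Decode the Base64 string into a binary xstring.
lv_xpdf = cl_http_utility=>decode_x_base64( lv_base64 ).

" 3) Write the xstring to the frontend as a PDF (binary download).
DATA(lt_bin) = cl_bcs_convert=>xstring_to_solix( iv_xstring = lv_xpdf ).
cl_gui_frontend_services=>gui_download(
  EXPORTING filename     = 'C:\temp\output.pdf'
            filetype     = 'BIN'
            bin_filesize = xstrlen( lv_xpdf )
  CHANGING  data_tab     = lt_bin ).
```

Passing bin_filesize explicitly matters for binary downloads, since the last table row is usually only partially filled.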

 

I now have a new, valid PDF file named "An Easy Reference for ALV Grid Control.new.pdf".

 

Comparing the files using MD5 tells me that they are not identical!

 

screenshot_03.png

So I used a binary editor to compare them:

screenshot_04.png

There are some extra bytes in the new file.

 

This does not interfere in this case, but it might cause a problem with other file formats, and it is annoying... (Any ideas?)
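One way to narrow down where the extra bytes come from (a sketch; it assumes the class cl_abap_message_digest is available on your release, and lv_xpdf stands for the decoded xstring in the program) is to hash the xstring on the application server before the frontend download:

```abap
" lv_xpdf: placeholder for the xstring returned by decode_x_base64.
DATA lv_xpdf TYPE xstring.
DATA lv_hash TYPE string.

" If this MD5 already differs from the source PDF's MD5, the extra
" bytes were introduced by the upload/decode steps; if it matches,
" the culprit is the frontend download.
cl_abap_message_digest=>calculate_hash_for_raw(
  EXPORTING if_algorithm  = 'MD5'
            if_data       = lv_xpdf
  IMPORTING ef_hashstring = lv_hash ).

WRITE: / 'MD5 of decoded xstring:', lv_hash.
```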

 

That is all for now.

 

Regards.


Simple generic popup selection screen for quick input or output


Hola,

 

Today I'm going to share with you a tiny class that can save a lot of time when you need to develop something quickly.

 

For those who appreciate generic programming it will be a nice present as well.

 

So, with the help of the class CL_CI_QUERY_ATTRIBUTES, we can raise a screen like this:

 

upload_2015-01-27_at_1.05.03_pm.png

with code like this:

 

report ztest_generic_screen.

start-of-selection.
  perform main.

form main.
  data: lv_bukrs      type bukrs,
        " select-options: type range of.. (must be a dictionary type)
        lrg_werks     type ckf_werks_table,
        " select-options: table, separate values
        lt_werks      type plants,
        " checkbox + radiobutton (must be DDIC types)
        lv_simulate   type xfeld,
        lv_mode_open  type xfeld,
        lv_mode_close type xfeld,
        lv_language   type spras.

  lv_language = sy-langu.

  " Generic screen call
  cl_ci_query_attributes=>generic(
    exporting
      " unique screen ID
      p_name       = conv #( sy-repid )
      " screen title
      p_title      = 'Generic screen for input'
      " screen fields
      p_attributes = value #(
        " parameter field
        ( kind = 'S'
          obligatory = abap_true
          text = 'Company code'(001)
          ref = ref #( lv_bukrs ) )
        " select-option
        ( kind = 'S'
          text = 'Plant'(002)
          ref = ref #( lrg_werks ) )
        " select-option, no intervals
        ( kind = 'T'
          text = 'Additional plants'(008)
          ref = ref #( lt_werks ) )
        " screen group
        ( kind = 'G'
          text = 'Mode'(006)
          ref = ref #( sy-index ) )
        " radiobuttons
        ( kind = 'R'
          text = 'Open'(004)
          button_group = 'MOD'
          ref = ref #( lv_mode_open ) )
        ( kind = 'R'
          text = 'Close'(005)
          button_group = 'MOD'
          ref = ref #( lv_mode_close ) )
        " checkbox field
        ( kind = 'C'
          text = 'Simulate'(003)
          ref = ref #( lv_simulate ) )
        " listbox field
        ( kind = 'L'
          text = 'Language'(007)
          ref = ref #( lv_language ) ) )
      " display only flag
      p_display    = abap_false ).
endform.

 

As you can see, we can use checkboxes, radiobuttons, listboxes, screen groups, and the obligatory option.

 

I don't suggest using this class in serious programs; of course, only real screens provide all the necessary features.

 

But if we are talking about a simple screen raised from a class, this is a fast solution that avoids screen and GUI status creation.

 

Good Luck,

 

I hope you liked it!

 

Adios.

Code Review: Success factors


We often hear that code reviews are frustrating and a waste of time. Really? Or is the lack of adoption of a suitable, well-defined process the root cause of them appearing futile? Think again!

 

There are many articles highlighting the importance of code reviews, listing to-do and not-to-do instructions, and explaining different code review alternatives (automated, peer review, etc.), so I am not detailing those here.

 

Through this blog, I want to stimulate thought and highlight how the "right" code review process, tailored to your organization's structure and needs, plays a significant role in embracing the code review mind-set within the project team and among stakeholders.

 

By the "right" code review process, I mean one that is accepted and easily integrated within your software development/release life cycle (SDLC). It shouldn't look like a disparate, additional step or a hindrance.

 

To evolve to the "right" and robust code review process, your first step is to ensure that code reviews are encouraged and actually happen. This is truly possible only if there is awareness of their importance and buy-in from key stakeholders. It is not only the developers who drive it; sincere acceptance and encouragement from project managers, business leads, analysts and end-users primarily contribute to its success.

 

I am part of an SAP development team and feel proud to say that in my current organization this process is thoughtfully customized, neatly defined and well integrated with the other phases of the SDLC.

 

We have a clearly written code review checklist and a coding standards document that are easily accessible to stakeholders. These documents contain answers to FAQs, security-related coding norms, tips to improve performance, and guidance on writing code for easy maintainability, globalization and reuse.

 

To effectively implement the code review process, we have a dedicated code review team: an independent, unanimously recognized group that governs the process and the coding standards. It functions across modules and projects and owns accountability for code changes moving to the production environment.

 

The code review process is also tightly integrated with the subsequent change control process (the process that moves code changes to production) in the SDLC. The change control mechanism checks the code review status and warns when code that has not been reviewed is proposed for a move to production. In such a case, it alerts stakeholders and triggers action points so they can take appropriate action.

 

I am highlighting the key points that have worked well for us.

 

Mind-set:

To develop this mind-set, the project team is kept informed and educated via different forums (seminars, blogs, trainings and question hours) to gain their buy-in and feedback. This has helped in evolving to the "right" process.

 

The Team:

Forming a "dedicated", "independent", "unbiased" group is key to its unanimous acceptance. The role and responsibility of the team are clearly defined and accepted. The team has a good mix of experienced professionals with wide-ranging technical understanding.

 

Documentation:

The content is simple, precise and easy to understand without missing the exceptions and specifics. It is easily available to all. Any updates to it, and their communication, are well governed.

 

Deep integration:

The process is thoughtfully tailored to our needs and well integrated with the other phases of the software development life cycle (SDLC). Though an independent process, it is an integral part of the SDLC.

 

With the above thoughts on the code review process, I open the floor for further discussion and invite you to share your experiences within your organization.

Execute Searchhelp with Search Exit in Background


Hi everybody,

 

I recently had the requirement to get the values of an existing search help. Pretty simple when you are in dialog mode; but unfortunately the requirement was that the search help had to be called in the background, without opening the search help dynpros (it is called in an OData service implementation). I didn't want to recreate the existing search logic, because it wasn't a simple search help: most of the logic sits in a search help exit, and I didn't want to write any kind of wrapper myself or implement the search logic again. I would like to share the following code with you, so that next time somebody has this requirement, less research is needed; I found nothing on SCN. For my requirement this solution works fine, as all dynpro fields are suppressed. I think you can use it as an approach when you have similar requirements. I've also provided the source as an attachment.

If you use this approach, test it well: in my experience, search helps can be called in various ways, and the behaviour might differ on your system depending on release and user settings.

Any comments and improvements are most welcome.

 

Have a nice day.

 

Best regards,

Michael

 

*&---------------------------------------------------------------------*
*& Report  ZZTAB_SHLP_BG
*&---------------------------------------------------------------------*
*& Albrecht Michael, 28.01.2015
*&---------------------------------------------------------------------*
REPORT zztab_shlp_bg.

DATA:
  lv_search_help       TYPE ddobjname,
  lv_search_help_f4if  TYPE shlpname,
  lt_return_tab        TYPE STANDARD TABLE OF ddshretval,
  ls_search_help_infos TYPE dd30v.

PARAMETERS: p_shlp TYPE string DEFAULT 'Z_YOURHELP'.

START-OF-SELECTION.
  lv_search_help = p_shlp.
  lv_search_help_f4if = lv_search_help.

* Get search help information
  CALL FUNCTION 'DDIF_SHLP_GET'
    EXPORTING
      name          = lv_search_help
    IMPORTING
      dd30v_wa      = ls_search_help_infos
    EXCEPTIONS
      illegal_input = 1
      OTHERS        = 2.
  IF sy-subrc <> 0.
*   Implement suitable error handling here
  ENDIF.

* Call the search help in the background
  CALL FUNCTION 'F4IF_FIELD_VALUE_REQUEST'
    EXPORTING
      tabname             = ls_search_help_infos-selmethod
      fieldname           = ''
      searchhelp          = lv_search_help
      value               = '*'
      display             = ' '
      suppress_recordlist = 'X'
      selection_screen    = ' '
    TABLES
      return_tab          = lt_return_tab
    EXCEPTIONS
      field_not_found     = 1
      no_help_for_field   = 2
      inconsistent_help   = 3
      no_values_found     = 4
      OTHERS              = 5.
  IF sy-subrc <> 0.
*   Implement suitable error handling here
  ENDIF.
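For illustration, the result rows could then be processed like this (a sketch; DDSHRETVAL returns one row per field per hit, grouped by RECORDPOS):

```abap
" Each row of lt_return_tab holds one field value of one result record;
" rows sharing the same RECORDPOS belong to the same record.
LOOP AT lt_return_tab INTO DATA(ls_return).
  WRITE: / ls_return-recordpos,
           ls_return-fieldname,
           ls_return-fieldval.
ENDLOOP.
```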

Smartforms & Logos


Dealing with logos in Smartforms has always been a big problem. It never matches the client's requirements and as a result it's always a support call that I've dreaded receiving.

 

The first problem comes from uploading a normal 24-bit bitmap version of your source image.

Untitled.png

This gives you a staggered gradient and forces you to have an off-white background for the logo on all forms. Definitely not what the client wants...

 

All over the internet, the suggested solution is to change your logo to 256 colors to make the background of the logo completely white. But then you end up with this.

Untitled.png

Which looks even worse, in my opinion. Don't despair, however; there is a solution. It's called dithering.

 

I'm providing screenshots from the GIMP tool, since it's completely free and comparable to Photoshop, if only for its functionality and not its user interface. So, step 1: open the source image in GIMP and then click Image > Mode > Indexed.

 

Untitled.png

 

Then set up the options as follows; note the Dithering setting.

 

Untitled.png

 

Finally, choose File > Export As and select Windows Bitmap. No need to change any compatibility settings, as your image is already at 256 colors.

 

And here's the result.

 

Untitled.png

That's pretty close to the original if you ask me, especially at this scale.

 

Hope this was useful.

 

As a final note, try to fit your image onto the Smartform at 100 dpi once you've made the above changes. I've noticed that SAP's image compression when you increase the DPI makes the logo look terrible again.

Why Object Oriented? - Encapsulation


I've been away from coding for some time (on project management tasks), so I haven't been able to write as much as I would like. But recently I had the time to do some coding, and again I was reminded of the benefits of going object oriented. This time: encapsulation, something I indirectly wrote about in my previous blog (Why object oriented? - Class Attributes), but it is a more generic topic that deserves its own entry.

 

To provide some context, I was doing some work related to material valuation, where I wanted to find the mean average price for a given plant over the last few years, considering only goods receipts and stock migrations.

 

I had an object for the material movement line item, and I was summing all the values/quantities and then calculating an average. In terms of code, I had something like this:

 

DATA(lt_movements) = zcl_movement_item=>get_items_with_filter( filter ).
LOOP AT lt_movements INTO DATA(lo_movement).
  lv_total_quantity = lv_total_quantity + lo_movement->get_quantity( ).
  lv_total_value    = lv_total_value    + lo_movement->get_value( ).
ENDLOOP.
lv_map = lv_total_value / lv_total_quantity.

 


While testing the program I discovered a bug related to STOs, where the value of the transfer is posted on movement type 641, not the 101. I had to change the GET_VALUE( ) method to take this logic into consideration. If you extrapolate to a situation where GET_VALUE( ) is used in multiple places, you can easily see the benefit of encapsulation.
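A rough sketch of how that change might look inside the method (illustrative only: the helper methods and attribute names below are my assumptions, not the author's actual code):

```abap
METHOD get_value.
  " Hypothetical sketch: for STO receipts the value sits on the 641
  " movement, so read that movement instead of the 101.
  IF me->is_sto_receipt( ) = abap_true.    " hypothetical helper
    rv_value = me->get_value_of_641( ).    " hypothetical helper
  ELSE.
    " First version: a plain read of the line item value from MSEG.
    SELECT SINGLE dmbtr FROM mseg INTO rv_value
      WHERE mblnr = me->docnumber
        AND mjahr = me->docyear
        AND zeile = me->docitem.
  ENDIF.
ENDMETHOD.
```

Every caller of GET_VALUE( ) picks up the fix automatically; that is exactly the benefit being described.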

 

But why is this specific to object-oriented programming? I can encapsulate in procedural programming too, right? Yes, you can, but with some drawbacks:



     1.   Too Verbose

 

The equivalent procedural code, just for one attribute would be something like:

 

PERFORM get_value_of_movement_item USING    lv_docnumber
                                            lv_docyear
                                            lv_docitem
                                   CHANGING lv_value.
lv_total_value = lv_total_value + lv_value.


Not only does it take more work to write the code (and laziness takes over), it's also harder to read.

 

     2.    Lack of general rules

 

If you consider that GET_VALUE( ) contains (in its first version) only a SELECT statement on MSEG, you can easily conclude that many programmers won't bother creating a FORM just for the SELECT; they will write the SELECT directly in the LOOP (forget the FOR ALL ENTRIES discussion please, that is not the point).

 

You could then say "I create a FORM for every DB access", but this is just one example. GET_VALUE( ) could return a calculation between attributes of the object: lv_value = me->quantity * me->unit_price. Don't try to convince me that a procedural programmer would create a FORM just for a multiplication.

 

In object-oriented programming there are rules that enforce this, so I don't have to think:

  • Every characteristic of the object is accessed through a getter: this prevents me from putting the quantity * net_price outside my class. I say characteristic and not attribute to distinguish a formal attribute of the class from a characteristic of the movement line item; in my case, for example, the value of the line item was not an attribute of the class;

  • Every DB access must be made inside the respective class: this prevents me from having a rogue SELECT * FROM MSEG somewhere in my code instead of retrieving the value from the class via its getter.

 

I don't have to think about whether GET_VALUE( ) is only a SELECT or 100 lines of code; it has to exist according to OO rules, and the only way to get the value of the movement is through GET_VALUE( ), so there is only one point of access to update.

 

Encapsulation is extremely important because things change. In my example, what started as a simple SELECT FROM MSEG later turned into something that had to behave differently for STOs.

 

PS: I know I take a performance hit due to encapsulation, but having scalable and bug-free code is a priority for most of the projects I handle.

Using Google URL Shortener service via ABAP.


Sometimes you need to use a short URL instead of the full one. For example, a URL printed inside a table in a PDF form with Adobe LiveCycle may be very long, and sometimes it is not pretty (or does not work at all).

 

How to do that in ABAP?

 

There are different ways, as you can see in this great blog post from Ivan Femia.

 

Anyway, as you may know, Google offers this service for free if you stay under the following limit:

 

1,000,000 requests/day


So... we all know Google as one of the best service providers in the world, so why don't we leverage this opportunity?


Maybe you know this blog post: Integrating Google Glasses with SAP


As I explained there, the GOOGLE API ABAP CLIENT already supports Google Glass and has the same structure as the standard Google PHP APIs, so I could add support for the Google URL shortener service by adding just a few lines of code and two ABAP classes!
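Under the hood, the service boils down to a single REST call. Outside the framework, a minimal raw sketch might look like this (the API key placeholder, URL and JSON handling are illustrative; the framework classes wrap all of this, including OAuth, for you):

```abap
" Illustrative sketch: call the Google URL Shortener REST API directly.
" <YOUR-API-KEY> is a placeholder; error handling is omitted for brevity.
DATA lo_client TYPE REF TO if_http_client.

cl_http_client=>create_by_url(
  EXPORTING url    = 'https://www.googleapis.com/urlshortener/v1/url?key=<YOUR-API-KEY>'
  IMPORTING client = lo_client ).

lo_client->request->set_method( 'POST' ).
lo_client->request->set_content_type( 'application/json' ).
lo_client->request->set_cdata( '{"longUrl":"http://scn.sap.com/community/abap"}' ).

lo_client->send( ).
lo_client->receive( ).

" The response JSON contains the short URL in the "id" field.
DATA(lv_json) = lo_client->response->get_cdata( ).
```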


And in just one hour, we got our new functionality.



1.1.PNG


1.2.PNG


2.PNG


If there are any other Google services that might be useful for your business processes, just drop me an email and I will extend the framework, or join the project directly on GitHub here: Gh14Cc10/google-api-ABAP-client · GitHub


----


To make the new functionality work, just clear the table zgoogle_access if necessary (this is required because the requested access token has a different scope from the one requested by the Google Glass demo report), and run the demo report ZGOOGLE_TEST_URLSHORTENER.



International Editable SALV Day 2015


no change.png

 

 

http://scn.sap.com/thread/733872

 

Dear Programmers,

On the 8th of February 2008 James Hawthorne started the discussion (link above) asking why SAP does not give CL_SALV_TABLE the option to be editable, as CL_GUI_ALV_GRID is.

Today is the 7th anniversary of that discussion, and hence it is “International Editable SALV Day”.

 

I started a little discussion on this here on SCN last June:-

 

http://scn.sap.com/thread/3567633

 

Back in the year 2003 I went to SAPPHIRE in Brisbane and the speakers from SAP made much of the convergence of reports and transactions.

 

To put that another way – people did not just want a list of blocked sales documents where the system could not determine the price. They wanted a list of such documents on a screen where you could actually change the price yourself to fix the problem, and then release the document there and then.

 

The obvious answer is an editable grid, rather like the table controls in classical DYNPRO.

 

In CL_GUI_ALV_GRID this was a piece of cake. You could set whatever cells you like to be editable, and programmatically add extra custom commands at the top of the screen e.g. a custom “release” button.

 

CL_SALV_TABLE was supposed to be the successor of CL_GUI_ALV_GRID, and SAP pushed us to use it instead. Whilst far better in many ways, CL_SALV_TABLE has some surprising drawbacks, leaving it with less functionality than its predecessor.

 

·         You cannot edit the data

·         You cannot programmatically add commands in the full screen mode

·         You cannot add separator lines between icons

 

All of this was available in CL_GUI_ALV_GRID. The really strange thing is that at the heart of a CL_SALV_TABLE instance is a CL_GUI_ALV_GRID instance, so obviously all of the above functionality could be added if it were so desired.

 

You will see from the prior blogs that the history of this is as follows:-

·         Every single ABAP developer desires this extra functionality in CL_SALV_TABLE

·         Many workarounds have been proposed on assorted SCN blogs

·         SAP development looks for those workarounds, and changes the CL_SALV_TABLE to block them

·         People tried to do the right thing, and put a proposal on “idea place” to make the SALV editable, a proposal which got a lot of votes

·         A senior SAP person from the development department got very upset indeed, said the SALV was never meant to be editable, and closed off the idea.

 

To quote from “V for Vendetta”, however: you cannot kill an idea.......

 

Ideas are Bullet Proof.png

 

So seven years on, and workarounds are still springing up. SAP development can close off certain avenues, but with the enhancement framework it is possible for developers to get around virtually any barriers placed in their way.

 

Wouldn’t it be easier if this was not a fight between the central team at SAP who develop the ABAP language and the users of that language?

 

Someone once said that if there was a law, and virtually everybody broke that law on a day to day basis, would it not be worth looking at the law to see if it actually made sense?

 

I don’t mind admitting I have created several SALV reports in my company where the users can edit data, and I imagine I am not alone. I am also sure SAP development would be horrified by this fact. They hate people doing workarounds and cannot fathom why anyone would do such a rule-breaking, dangerous, risky thing just to keep the business users happy.

 

As I said at the end of my last posting on this subject, might I humbly suggest to the red-nosed SAP ALV development team that the easiest way to stop people doing workarounds is to remove the need for a workaround in the first place, by having the functionality as standard.

 

So the question is – in 12 months’ time will I be posting another blog to celebrate the 9th anniversary of International Editable SALV Day?

 

Cheersy Cheers

 

Paul

 

 


Procedure to upload Excel data into an internal table while debugging in SAP ABAP.


Hello SCN Members,

 

Good Morning,

I faced a situation where the SAP development system had no data while the testing system did. Adding data to the internal table manually was taking a lot of time, so I tried copying only the changed data of a row, from the 3rd column to the last column, in the Excel sheet. After doing that many times, I got an idea and proceeded as below:

 

While debugging, wherever I needed to fill an internal table I set a breakpoint and filled the data by uploading the Excel file that had been downloaded from the test server. The process is as follows:

 

First, set a breakpoint where the internal table needs to be filled, so the debugger stops there. See the picture below: in the debugger's Standard tab, use the Services of the Tool button as highlighted, then click on the internal table and you will see its data:

 

 

sample1.png

You will see the data as above.

 

Go to Services of the Tool via the icon highlighted below. The internal table here is LT_FINAL, with 6 records.

sample2.png

You will get the pop-up shown below, listing the table display services:

 

sample 3.png

 

 

Double-click the upload service to load the test server data; you will get the pop-up message below. Click Yes.

 

sample4.png

After that you will get another pop-up message; click 'Allow'.

 

sample 5.png

 

After that, the data is filled into that particular internal table (in this case, internal table LT_FINAL with 40 records).

 

With that, I made my program changes successfully. I hope this blog will benefit our members a lot.

 

Regards,

Siva kumar.

Delivery Picking, Packing, Unpacking, Goods Issue.


As I was working on a program that would do the picking, creation of handling units, packing of HUs, unpacking of HUs, and delivery/picking quantity updates, I found it very difficult to execute the BAPIs that were provided or to use the function modules that I found while debugging the standard code.

 

The function modules that I found while debugging the standard code would not work when executed standalone; they required another FM to buffer the data into global values.

 

Also, OSS Note 581282 clearly says that BAPI_HU_CREATE, BAPI_HU_DELETE, BAPI_HU_PACK, BAPI_HU_REPACK and BAPI_HU_UNPACK do not update the delivery and hence cannot be used for packing. In the same way, it is not possible to pack deliveries with the function modules of function group V51E (HU_CREATE_ITEM, HU_CREATE_ONE_HU, HU_DELETE_HU, HU_REPACK, HU_UNPACK).

 

Hence I have used one FM for various delivery-related scenarios, and I hope it will help you in the future.

 

1. Picking and Packing Materials to Handling Units

 

 

We can use FM WS_DELIVERY_UPDATE to update the picking quantity in order to complete the delivery, and also to pack the materials into the unique handling units created.

 

In table lt_hvbpok various details need to be added, especially keeping in mind the following fields:

 

 

lt_hvbpok-vbeln_vl = delivery no.
lt_hvbpok-posnr_vl = delivery item.
lt_hvbpok-posnn    = delivery item.
lt_hvbpok-vbeln    = delivery no.
lt_hvbpok-vbtyp_n  = 'Q'.
lt_hvbpok-pikmg    = qty to be picked.
lt_hvbpok-lfimg    = qty to be picked.
lt_hvbpok-lgmng    = qty to be picked.
lt_hvbpok-meins    = unit of the qty to be picked.
lt_hvbpok-ndifm    = 0.
lt_hvbpok-taqui    = 'X'.
lt_hvbpok-werks    = plant.
lt_hvbpok-lgort    = storage location.
           

    

If you need to pack the materials into specific handling units, these can be created using FM BAPI_HU_CREATE. Once the external IDs are created, you can use the following FM to attach materials to these handling units. The following fields need to be taken care of:

 

lst_verko-exidv = external handling unit ID.

 

(There can be multiple of these, as one delivery can have any number of handling units.)

 

lst_verpo-exidv_ob = HU external ID.
lst_verpo-exidv    = HU external ID.
lst_verpo-velin    = '1'.
lst_verpo-vbeln    = delivery no.
lst_verpo-tmeng    = material quantity to be packed.
lst_verpo-matnr    = material no.
lst_verpo-werks    = plant.
lst_verpo-lgort    = storage location.

 

Example:

CALL FUNCTION 'WS_DELIVERY_UPDATE'
  EXPORTING
    vbkok_wa                    = lst_vbkok
    synchron                    = 'X'
    commit                      = 'X'
    delivery                    = lv_delivery          " delivery number
    update_picking              = 'X'
    if_database_update          = '1'
    nicht_sperren               = 'X'
    if_error_messages_send_0    = 'X'
  IMPORTING
    ef_error_any_0              = lw_ef_error_any
    ef_error_in_item_deletion_0 = lw_ef_error_in_item_deletion
    ef_error_in_pod_update_0    = lw_ef_error_in_pod_update
    ef_error_in_interface_0     = lw_ef_error_in_interface
    ef_error_in_goods_issue_0   = lw_ef_error_in_goods_issue
    ef_error_in_final_check_0   = lw_ef_error_in_final_check
    ef_error_partner_update     = lw_ef_error_partner_update
    ef_error_sernr_update       = lw_ef_error_sernr_update
  TABLES
    vbpok_tab                   = lt_hvbpok
    prot                        = lt_prot
    verko_tab                   = lt_verko
    verpo_tab                   = lt_verpo.

 

2. Unpacking

 

Now, once the materials are packed into the handling units, what if you need to unpack them and repack them into different handling units?

 

There are mainly two approaches.

1. You could use the same FM WS_DELIVERY_UPDATE and pass the table it_repack, where the source and destination HUs are set according to the requirement. Source and destination HUs are nothing but external handling units.

 

So consider a case where material M1 is packed into handling unit H1 and for some reason you need to repack M1 into handling unit H2. In that case your source HU will be H1 and your destination HU will be H2, with the material details and quantities.

2. The second approach, which I took because my program required it, was to completely unpack the materials from the handling units. In this case we can again use the same function module WS_DELIVERY_UPDATE, but here you need to send the quantity as a negative value (-QTY). The minus sign plays a crucial role; otherwise you will keep wondering why this FM is not working.

 

So remember: pass the same values in tables lt_verko and lt_verpo as above, but send the quantities with a '-' sign. :D
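That detail can be sketched like this (illustrative only; lt_verpo as filled in the packing example above):

```abap
" To unpack, reuse the packing entries but negate the quantity:
" WS_DELIVERY_UPDATE interprets a negative TMENG as an unpack.
LOOP AT lt_verpo ASSIGNING FIELD-SYMBOL(<ls_verpo>).
  <ls_verpo>-tmeng = - <ls_verpo>-tmeng.
ENDLOOP.
```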

 

3. Updating Tables VEPO/VEKP for HUs

 

The best way to work around this is to use the FM V51S_HU_UPDATE_DB.

 

In this FM you can create, delete or update entries in tables VEPO and VEKP. Even a change in quantity can be handled by calling this FM. Do not forget a COMMIT WORK at the end. ;) You might get an error stating that one of the tables is not defined; to avoid this, you need to pass all the tables programmatically, as below, even if you are supplying values in only one of them.

CALL FUNCTION 'V51S_HU_UPDATE_DB'
  EXPORTING
    it_hdr_insert = lt_ins_vekp
    it_hdr_update = lt_upd_vekp
    it_hdr_delete = lt_del_vekp
    it_itm_insert = lt_ins_vepo
    it_itm_update = lt_upd_vepo
    it_itm_delete = lt_del_vepo
    it_his_insert = lt_his_ins
    it_his_update = lt_his_upd
    it_his_delete = lt_his_del.

 

4. Changing the Delivery Quantity of a Material in a Delivery

 

What if you need to update the delivery quantity and picking quantity of a delivery? You can do so by passing the table vbpok_tab, where the new delivery quantity and picking quantity need to be supplied:

    
lst_vbpok_tab-pikmg = picking qty.
lst_vbpok_tab-lfimg = delivery qty.

 

5. Goods Issue

We can also post the goods issue using the same FM. All we need to do is pass the vbkok_wa structure with lst_vbkok-wabuc = 'X' set. The table vbpok_tab also needs to be populated.

 

CALL FUNCTION 'WS_DELIVERY_UPDATE'
  EXPORTING
    vbkok_wa                 = lst_vbkok
    synchron                 = 'X'
*   no_messages_update       = ' '
    commit                   = 'X'
    delivery                 = lv_delivery          " delivery number
    update_picking           = 'X'
    nicht_sperren            = 'X'
    if_error_messages_send_0 = 'X'
  TABLES
    vbpok_tab                = lt_hvbpok
    prot                     = lt_prot.

 

So, as you can see, one function module can help us cover so many functionalities; all we need to do is know the right tables that need to be passed!

 

Good Luck!!

Three simple uses of object-oriented concepts in your daily work


Changing habits is hard, especially if we are changing to something that looks more complex; but change is also inevitable for a developer who wants to adapt to an evolving environment. For developers who want to do things the right way, ABAP Objects was introduced more than a decade ago, yet there are still some practical difficulties (or not) in using it in all our daily work, as can be seen in several blogs like this or this. But in my opinion there are more positives than negatives, which I will try to explain.

 

Using the three methods explained below, we will not only replace our way of working with its object-oriented equivalent (which has further advantages, also explained below), but also become more and more familiar with ABAP Objects, so that we can use other fundamental object-orientation approaches like inheritance and encapsulation.

 

  • Use local classes for internal modularization

When you check the F1 help for a subroutine, you will face the ugly truth that the PERFORM statement is obsolete, just like some other ugly features such as the TABLES statement. Even if you have no intention of deepening your knowledge on the subject, using local classes and methods for modularization in executable programs has been the only valid way since 2011.

 

Luckily, methods bring more advantages than just modularization. The syntax check inside classes is stricter: obsolete constructs (like header lines for internal tables, the RANGES statement, etc.) are not allowed there, so just by using methods we avoid them automatically. Anything obsolete is obsolete for a reason and we should never use it; but if you keep developing procedurally and do not read the ABAP news or documentation and do not use quality-check tools, you may not even be aware that the commands you use are obsolete. You avoid this simply by using methods instead of subroutines.

 

You can find a simple example at the end of this blog (or just search SCN for "using methods for modularization").

 

 

  • Using global classes instead of function modules

It is almost the same approach as with local classes: to replace function modules, you can create a global class and, depending on your case, create static or instance methods for modularization. As far as I know there is no class-based alternative for RFC-enabled and update function modules, so for remote function calls it is still necessary to create function modules.
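As a sketch of what such a replacement could look like (all names here are illustrative, not from any real system), a function module such as a hypothetical Z_GET_PLANT_NAME can become a static method of a global class:

```abap
" Illustrative only: a static method replacing a function module.
CLASS zcl_plant_service DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    CLASS-METHODS get_plant_name
      IMPORTING iv_werks       TYPE werks_d
      RETURNING VALUE(rv_name) TYPE name1.
ENDCLASS.

CLASS zcl_plant_service IMPLEMENTATION.
  METHOD get_plant_name.
    " Same logic that previously lived in the function module
    SELECT SINGLE name1 FROM t001w
      INTO rv_name
      WHERE werks = iv_werks.
  ENDMETHOD.
ENDCLASS.
```

A caller then uses `zcl_plant_service=>get_plant_name( iv_werks = '1000' )` instead of CALL FUNCTION, and gets the stricter syntax check of classes for free.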

 

 

If you start with the above approach to modularization, you will first of all get familiar with the OO approach and its syntax standards, and you will start to produce well-structured, reusable developments, provided you also follow good modularization standards as suggested in the ABAP help documents. Having such well-structured programs with reusable components will help you a lot on your object-oriented programming journey if you wish to proceed further that way.

 

  • SALV

SALV (the global class CL_SALV_TABLE, for example), which came with software component 640, is a different approach to producing the same ALV output. It is better than the previous ALV tools and was really designed with an OO approach. By analyzing how it is built and by using it, we get an idea of what our own developments should look like if we really want to develop something object-oriented.

 

Each group of attributes lives in a separate class; for example, if you need to change a column or use events, you reference the relevant class cl_salv_columns_table or cl_salv_events_table. We should not just use SALV; we should also analyze and learn from its model. Once you get familiar with it, you will see that it is the simplest of all ALV tools. To find all kinds of SALV examples, just search for the "SALV_DEMO*" programs via SE38.
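A minimal SALV sketch along these lines (the data source and the column chosen are just for illustration):

```abap
" Display SFLIGHT via the SALV object model; column settings are
" handled by their own class, cl_salv_columns_table.
DATA: lt_sflight TYPE STANDARD TABLE OF sflight,
      lo_salv    TYPE REF TO cl_salv_table.

SELECT * FROM sflight INTO TABLE lt_sflight UP TO 50 ROWS.

TRY.
    cl_salv_table=>factory(
      IMPORTING r_salv_table = lo_salv
      CHANGING  t_table      = lt_sflight ).

    DATA(lo_columns) = lo_salv->get_columns( ).
    lo_columns->set_optimize( abap_true ).
    lo_columns->get_column( 'CARRID' )->set_short_text( 'Airline' ).

    lo_salv->display( ).
  CATCH cx_salv_msg cx_salv_not_found.
    " handle the error, e.g. MESSAGE
ENDTRY.
```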

 

 

The above three practices can be a good starting point for anyone who wants to handle daily programming tasks in an object-oriented way; the next step should be improving your modularization approach. Once you have experience with all of the above and focus on well-structured, reusable, single-purpose methods in your developments, it will be easier to go one step further and create a complete OO design.

 

Example (using a local class for modularization).

Definition

Capture.PNG

Global reference

Capture.PNG

 

Instance creation


Capture.PNG

 

Use of methods

Capture.PNG

 

Capture.PNG
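For readers who cannot see the screenshots, a textual sketch of such a local class (all names are illustrative) could look like this, covering the same four steps as the images above (definition, global reference, instance creation, use of methods):

```abap
REPORT z_local_class_demo.

" Definition
CLASS lcl_report DEFINITION.
  PUBLIC SECTION.
    METHODS: get_data,
             display.
  PRIVATE SECTION.
    DATA mt_flights TYPE STANDARD TABLE OF sflight.
ENDCLASS.

CLASS lcl_report IMPLEMENTATION.
  METHOD get_data.
    SELECT * FROM sflight INTO TABLE mt_flights UP TO 100 ROWS.
  ENDMETHOD.

  METHOD display.
    TRY.
        cl_salv_table=>factory(
          IMPORTING r_salv_table = DATA(lo_salv)
          CHANGING  t_table      = mt_flights ).
        lo_salv->display( ).
      CATCH cx_salv_msg.
        " handle the error
    ENDTRY.
  ENDMETHOD.
ENDCLASS.

" Global reference
DATA go_report TYPE REF TO lcl_report.

START-OF-SELECTION.
  " Instance creation
  CREATE OBJECT go_report.
  " Use of methods
  go_report->get_data( ).
  go_report->display( ).
```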

Using Categories in the ABAP development space


The ABAP development space contains more than 370,000 documents, blogs and other contributions! In the past we introduced subspaces to organize the content. Despite the obvious advantage of establishing some order in the space, there are some disadvantages. Quite often content does not belong to exactly one topic; you would like to assign it to two or more spaces equally. To overcome these disadvantages we decided to use the Category feature that SCN offers.

The first two categories we introduced are:

  • ABAP Development
  • ABAP Trials and Developer Editions

 

Searching for Categories

 

Open the content tab in the ABAP development space. The categories appear on the left-hand side.

Categories.PNG

 

Just click on a category link and the content is filtered accordingly.

 

Categorizing Content

 

As a contributor you may want to categorize your content. Just create it as usual. In the lower part you find the section Categories and the list of all available categories. Just mark the relevant ones.

 

Flagging.PNG

 

When to use the Category ABAP Development?

 

Use this category for ABAP language related contributions to distinguish it from the complete space.

 

When to use the Category ABAP Trials and Developer Editions?

This category is reserved for all content related to the ABAP trials and developer editions in the SAP Cloud Appliance Library and the ABAP download systems.

 

What about the Existing Content?

The existing content will not be categorized by us: with more than 370,000 documents, nobody would accept that drudgery. But you can categorize your own existing content if you feel the need.

 

Will there be New Categories?

 

We are not analyzing the existing content to come up with a complete classification system. The categories will grow on demand. So if you would like a new category to be introduced, notify one of the space editors.

 

What is the Difference to Tags?

 

Tags are not bound to a space. Categories belong to a space. You can create your own tags but not your categories.

 

 

For more information, see this thread.

Delete entries from a table with a variable table name


Hello guys,

 

Maybe this is a repost, but I couldn't find anything like it; or maybe it's too easy for most of you.

While developing a report I had to fill several tables, so I needed a report to reset my Z-tables (delete all entries). It's an easy task if you use just one table in a report or clear all your tables with hardcoding, but I wanted to create a program I can use for every project and every Z-table.

 

Please beware: use this at your own risk. Deleting data from a table is always a sensitive issue, so don't use this if you don't know exactly what you are doing.

 

There is a parameter where you can enter the name of the table you want to empty, plus two radio buttons: one to show the data that would be deleted, and one for the deletion itself.

 

I tried to add a lot of comments, but if you have questions feel free to ask.

 

 

DATA: ref_tab TYPE REF TO data.                         "generic data reference
DATA: lv_char TYPE c.                                   "first letter, to test whether a Z-table is being deleted
DATA: lv_lines TYPE i.                                  "line count
DATA: gr_table TYPE REF TO cl_salv_table.               "for the preview, so you can see what you are about to delete

FIELD-SYMBOLS: <fs> TYPE table.                         "field symbol, assigned later with the type of the entered table

PARAMETERS: p_dbname TYPE string DEFAULT 'z_testtable'. "name of the table you want to empty
PARAMETERS: rb_test TYPE flag RADIOBUTTON GROUP grp1 DEFAULT 'X'. "test run
PARAMETERS: rb_del TYPE flag RADIOBUTTON GROUP grp1.              "every entry in this table will be !!!DELETED!!!


TRY.
    CREATE DATA ref_tab TYPE TABLE OF (p_dbname).  "create the table with the type of the entered table
  CATCH cx_sy_create_data_error.                   "table name was not found
    MESSAGE 'Not able to find structure' TYPE 'E'.
ENDTRY.
ASSIGN ref_tab->* TO <fs>.                    "assign structure

SELECT * FROM (p_dbname) INTO TABLE <fs>.     "select data

WRITE p_dbname TO lv_char.                    "write the first letter to the character field
TRANSLATE lv_char TO UPPER CASE.              "translate to upper case, so you can test whether it is a Z-table

"Now some checks before you are able to delete. You could also add an authorization object, but that would go too far here...
IF rb_del = 'X'.
  IF sy-uname <> 'HANI'.                      "<-- this is my user name, replace it with your own
    MESSAGE 'Sorry, this seems to be a program you are not allowed to use!' TYPE 'E'.
    LEAVE PROGRAM.
  ENDIF.

  IF lv_char <> 'Z'.                          "<-- check that a Z-table is being deleted; you should NOT delete SAP tables with this report
    MESSAGE 'Naaahhh, we will not delete SAP tables, only Z-tables.' TYPE 'E'.
    LEAVE PROGRAM.
  ENDIF.

  DESCRIBE TABLE <fs> LINES lv_lines.         "count how many lines will be deleted
  DELETE (p_dbname) FROM TABLE <fs>.          "delete
  WRITE: 'Table: ', p_dbname, ' - ', lv_lines, ' entries have been deleted'. "screen output

ELSEIF rb_test = 'X'.                         "display the data that would be deleted; copied from an SAP BC example report
  CALL METHOD cl_salv_table=>factory
    IMPORTING
      r_salv_table = gr_table
    CHANGING
      t_table      = <fs>.
  gr_table->display( ).
ENDIF.

Mass download from solution manager


Hi,

 

Recently I have been involved in a project that uses Solution Manager.

 

We had a need for a "mass download".

 

In the forum there was a mention of this screen:

screenshot_01.png

 

So I debugged the code and found the class cl_sa_doc_factory.

 

The method cl_sa_doc_factory=>get_read_url returns a URL string, which can be used with cl_http_client.

 

The program R_EITAN_TEST_60_02 (attached) demonstrates the process:

 

The program receives a list of "Logical documents" as SELECT-OPTIONS.

screenshot_06.png

 

for each "Logical document"

 

- Verify the value .

- Use cl_sa_doc_factory .

- Use cl_http_client .

- Use OPEN DATASET dataset_name FOR OUTPUT IN BINARY MODE  .
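A hedged sketch of these per-document download steps (variable names are illustrative; the URL is assumed to have been obtained from cl_sa_doc_factory=>get_read_url, and classic exception handling is omitted for brevity):

```abap
" Fetch the document content via HTTP and write it to the server
" file system in binary mode.
DATA: lo_client TYPE REF TO if_http_client,
      lv_data   TYPE xstring.

cl_http_client=>create_by_url(
  EXPORTING url    = lv_url        " from get_read_url
  IMPORTING client = lo_client ).

lo_client->send( ).
lo_client->receive( ).
lv_data = lo_client->response->get_data( ).
lo_client->close( ).

OPEN DATASET lv_filename FOR OUTPUT IN BINARY MODE.
TRANSFER lv_data TO lv_filename.
CLOSE DATASET lv_filename.
```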

 

The result is shown using function module BAL_DSP_PROFILE_POPUP_GET:

screenshot_07.png

 

And the files:

screenshot_08.png

 

Regards .

UML Class Diagram Export to XMI format with ABAP standard classes


Introduction

Did you ever have the situation that you are working on a customer system landscape where you wanted to export a UML class diagram for a package, but the necessary JNet was not installed? We have had that situation several times now, and it was always a big effort to install JNet, or it was not done at all because it was too much effort to add the configuration to the automated software installation process in the customer landscape.

 

Therefore we decided to write a little program that allows us to export UML class diagrams to an XMI format, which can then be imported into a UML tool. For us this is a big advantage for documenting the system functionality, because we no longer have to create/adjust the class diagrams manually in the UML tool.

 

The problem ...

In the ABAP Workbench the functionality is available to display a UML diagram (context menu on the package name -> Display -> UML Class Diagram).

uml01.png

This functionality starts the report UML_CLASS_DIAGRAM, which analyzes (depending on the settings made on the selection screen) the classes/interfaces and tries to display them with the JNet integrated into SAP GUI.

In case JNet is not installed (it is not installed by default with SAP GUI), you get just a list of the analyzed objects, but not the diagram. Our main problem is that without JNet the report cannot export the diagram for further usage.

uml02.png

 

The current solution for us ...

We created a little report based on the program UML_CLASS_DIAGRAM which uses the ABAP standard classes CL_UML_CLASS_SCANNER, CL_UML_CLASS_DECOR_XMI and CL_UML_UTILITIES to export a UML class diagram to XMI format without JNet.

You find the sources attached to this post. Just create the report and add the text symbol texts and selection screen texts. But please be aware that we did not invest much time and effort in making it stable for every situation. Please also consider that the program was implemented on a NW 7.40 SP08 system, so if you wanna use the program on a system with a lower release, some easy changes have to be made.

 

So I can do the following now.

 

  1. Start the program and enter a class (e.g. CL_UML_CLASS_SCANNER) which should be exported.
    uml03.png
  2. Choose the required XMI version.
    uml04.png
  3. Import XMI file to a supported UML tool.
    uml06.png

  4. Finished
    uml07.png

 

If you have any comments, please let me know.


Lines of Code Check with Code Inspector


 

Introduction

SAP delivers several metrics checks for the Code Inspector, for example:

  • Number of Executable Statements
  • Procedural Metrics with Statements check
  • OO Size Metrics (number of methods, attributes ...)
  • ...

From my point of view, checks that count statements have the problem that they do not say very much about the compliance of the coding with principles like the Single Responsibility Principle or the Separation of Concerns principle described by Clean Code Development. For example, think about a function in which you call ten functions. A statement count would only recognize the ten CALL FUNCTION statements, which would be OK in case you have defined a limit of ten statements in the statement count check. But what about the purpose of that? Does it make sense to really call those ten functions from a maintenance point of view, or would it make sense to structure the coding so that the functionality is more reusable? Another gap of the statement count is that big interfaces in functions or methods are not recognized either. So if you define a function with 100 parameters and call it in another function, the statement check would be OK, but from a Clean Code Development perspective it is terrible.

 

So from my point of view a check for Lines of Code is more useful in most cases. Violations of defined Lines of Code metrics give hints about violations of Clean Code Development principles (e.g. Separation of Concerns). In some projects I have third-party check tools that are able to check Lines of Code at processing-block level, but in other projects I do not have these tools or am not allowed to use them. Therefore I decided to make a little Code Inspector check implementation that does the check for us. Our main goal was to check the length of methods and functions, therefore in the first version the check supports only these two kinds of objects.

 

Lines of Code check explained from a user point of view

The implemented Lines of Code check is available under the standard Code Inspector check category "Metrics and Statistics" if you define a Code Inspector check variant.

ci01.png

The check has the following attributes, which allow configuring the check.

ci02.png

The attributes have the following meaning and can be configured separately for methods and functions:

  • Message if more than ...: Here the number of lines that is acceptable for the object can be defined. A message is only reported if the object has more lines than defined. If no number is defined, the check for the specific object is not executed.
  • Message Level: This attribute defines if the message should be reported as Error, Warning or Information/Note.
    • E = Error
    • W = Warning
    • N = Information/Note
  • Count start/end statement: Defines whether the start and end statements are also counted (e.g. for a function, the lines with FUNCTION and ENDFUNCTION would also be considered).
  • Count empty lines: Defines whether empty lines are counted.
  • Count comments: Defines whether comment lines are counted (lines beginning with a star, or lines that contain only a comment started with a quotation mark).
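As an illustration of how such line classification could work (this is a sketch with illustrative variable names, not the attached implementation):

```abap
" Classify each source line as empty, comment, or code and count it
" according to the check's attributes (p_count_empty / p_count_comments
" are hypothetical flags mirroring the attributes above).
LOOP AT lt_source INTO DATA(lv_line).
  DATA(lv_trimmed) = lv_line.
  CONDENSE lv_trimmed.
  IF lv_trimmed IS INITIAL.
    IF p_count_empty = abap_true.
      lv_loc = lv_loc + 1.           " empty line
    ENDIF.
  ELSEIF lv_line(1) = '*' OR lv_trimmed(1) = '"'.
    IF p_count_comments = abap_true.
      lv_loc = lv_loc + 1.           " full-line comment
    ENDIF.
  ELSE.
    lv_loc = lv_loc + 1.             " real code line
  ENDIF.
ENDLOOP.
```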

 

After executing an inspection with the check variant, you get the following messages in case a violation is found.

ci03.png

Of course the standard Code Inspector functions like navigation to the reported object are available here too.

 

How to implement the Lines of Code check and make it usable?

I do not want to describe here how to create your own Code Inspector check from scratch, because Peter Inotai already described it a long time ago in the blog Code Inspector - How to create a new check. I just wanna describe what steps you have to do to be able to use the Lines of Code check in your environment.


  1. Create ABAP Class
    In the attached SAPlink Nugget file or the TXT files you find the sources for the Lines of Code check. Either you import the SAPlink Nugget with SAPlink or you create the coding manually by creating a class ZZ_CL_CI_LINES_OF_CODE (or name it whatever you want) and copy the coding of the TXT files to it. For the manual approach I would recommend to use ABAP in Eclipse or the source based view in the class builder (otherwise you have to copy each method manually). The local helper class coding should be copied to the "local types" area. After that you should be able to activate the class. That is all you have to do with code.
    Consider that the check was written on a NW 7.40 system. Therefore some of the new ABAP syntax (e.g. table expressions, the NEW constructor expression) was used. If you wanna use the code on a system with a lower release, some little changes have to be made.

  2. "Register" ABAP Class for Code Inspector
    If you now start the Code Inspector transaction and define a new check variant you will not see the new check. To be able to see it you have to go to "Goto -> Management of -> Tests" in the Code Inspector transaction.
    ci04.png
    In the upcoming screen you should see the class you have created before. Just tick the flag in front of the class and the new check is then visible when you define a check variant.
    ci05.png

Integration with ABAP Test Cockpit

A great advantage of the Code Inspector checks is that they are used by the ABAP Test Cockpit. So in the ABAP Workbench and in the ABAP Development Tools for Eclipse the new check can be used without big additional effort. The only thing to be done is to define the Code Inspector check variant to be used by the ABAP Test Cockpit runs. This can be done by

  • defining your check variant as global default variant in transaction ATC.
  • defining the check variant each time you execute the ABAP Test Cockpit run by "Check -> Check with ABAP Test Cockpit (ATC) with" in the ABAP workbench
  • defining the check variant in the properties of an ABAP project in the ABAP Development Tools for Eclipse
    ci06.png

So you get the results of your check directly after executing the ATC checks. Here is an example result of the new check in the ATC problems view within Eclipse.

ci07.png

 

Conclusion

The implemented check is a nice helper that gives me immediate feedback about my code size. The next steps are to enhance the check so that it supports more objects (e.g. forms for legacy reasons, whole classes, programs). If you have any comments or questions, please let me know.

News about mockA


The current release of mockA is available at Github. It contains a bug fix that I would like to outline in today's blog post.

 

The bug

MockA allows you to mock classes as described in one of my previous blog posts. Technically, mockA tries to create a subclass of the class which is subject to the mock creation. This means it will only work if the class is not marked as final and has a constructor that is at least protected or public.

MockA overrides methods that should be mocked with a local implementation that returns the values expected to be returned. It follows the specifications set up by the unit test, according to the method( ), with( ) and exports( ) or returns( ) calls (and so on) during mock creation.

 

There is another feature that reuses subroutine pools generated by mockA, because the Web Application Server ABAP allows only about 36 generated subroutine pools per program, or, in our case, per unit test. The generated code does not contain any hard-coded method output parameters, as there would be no benefit in buffering the generated coding then. Instead, the instance of type ZIF_MOCKA_MOCKER is passed to the mock object. In the mock object's method implementations, the fake values are read from that instance. If a new mock object should be created, a new instance of ZIF_MOCKA_MOCKER is passed to the mock object. Hence, the method output may change.
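The runtime code generation this relies on is the GENERATE SUBROUTINE POOL statement; a minimal illustration of the mechanism (not mockA's actual code, all names are illustrative):

```abap
" Generate a subroutine pool containing a local class at runtime.
DATA: lt_code TYPE TABLE OF string,
      lv_prog TYPE syrepid,
      lv_msg  TYPE string.

APPEND 'PROGRAM subpool.'                 TO lt_code.
APPEND 'CLASS lcl_mock DEFINITION.'       TO lt_code.
APPEND '  PUBLIC SECTION.'                TO lt_code.
APPEND '    METHODS method_a.'            TO lt_code.
APPEND 'ENDCLASS.'                        TO lt_code.
APPEND 'CLASS lcl_mock IMPLEMENTATION.'   TO lt_code.
APPEND '  METHOD method_a.'               TO lt_code.
APPEND '  ENDMETHOD.'                     TO lt_code.
APPEND 'ENDCLASS.'                        TO lt_code.

GENERATE SUBROUTINE POOL lt_code NAME lv_prog MESSAGE lv_msg.
" Each generation counts against the per-program limit mentioned
" above, which is why mockA buffers generated pools where it safely can.
```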

 

If mockA generates the local implementation of an interface, each method is implemented during the initial mock class generation, so this feature poses no issues here.

However, in case a class needs to be mocked, mockA also tried to reuse these generated subroutine pools in the past. Do you see the error?

 

What could possibly happen

Imagine mockA should create a mock implementation of the following class, ZCL_I_WILL_CAUSE_TROUBLE, which has two methods:

  • METHOD_A
  • METHOD_B

  In our first unit test, we tell mockA to simulate the output of METHOD_A, without defining any output for METHOD_B.

When the mock object is created, mockA generates a local subclass of ZCL_I_WILL_CAUSE_TROUBLE, with a local implementation of METHOD_A that overrides the parent class's method. The parent class's method can no longer be called via the mock object. METHOD_B remains untouched.

After generation of the subroutine pool, the class implementation is buffered for later usage.

 

If another unit test, executed after the first one, wants to control the output of METHOD_B, mockA won't return that output: the method was not overridden in the first unit test, so no output control takes place in the locally created class implementation. The logic responsible for returning the specified fake values is simply not called; instead, the super implementation of ZCL_I_WILL_CAUSE_TROUBLE is called.

 

The solution

Subroutine pool buffering is now generally switched off if a class needs to be simulated. For interfaces, the current logic remains unchanged.

Unfortunately, this is a breaking change, which can cause unit tests to fail that have passed in the past.

This change could cause some new issues that I would like to outline briefly:

  1. Subroutine pool limits might be violated for existing unit tests. As multiple implementations might now be generated per class within the same unit test report, the subroutine pool limit might be violated once you update mockA. In this case, please split your test methods across several reports, if possible.
  2. Please see the example above: if your second unit test tells mockA to simulate the output of METHOD_B but actually expects a result returned by the super implementation, your unit test might now fail, as the super implementation is no longer called due to the correction; instead, the specified fake values will be returned.
    I know that this is just a theoretical consideration, but it is important to mention. Nevertheless, such test cases can be considered incorrectly implemented, as the unit test possibly expects values other than those defined as output for METHOD_B. Hence, these test cases should be reviewed anyway!

 

Feedback welcomed

There is no possibility to switch off the currently implemented behaviour of mockA, as I think it is more important to fix the error than to allow old and incorrect unit test implementations to keep passing.

 

Please let me know if you run into trouble with the new update, and whether issue 1) or 2) applies, or maybe both. Please also tell me if you figure out another issue that I didn't think of.

Static ABAP Code Analysis using ConQAT


 

Introduction

 

In the following sections I wanna give an overview of the usage of ConQAT as a static code analysis tool from an end-user point of view. I wanna explain why I use an additional tool and what information I get from it.

 

Why an additional tool for static code analysis?

With the Code Inspector (SCI) and the ABAP Test Cockpit (ATC), SAP already provides powerful tools for static code analysis. I have been using these tools (especially SCI) for years, but there are some usability and functionality gaps which can be closed using ConQAT. Examples of such gaps are:

  • Usable visualization of results (with text and graphics).
  • Baseline mechanism (define a code baseline for which no remarks are reported, e.g. in case of maintenance projects).
  • Historical view on check results (how the code quality increases/decreases over the time).
  • Check for redundant coding.

 

What is ConQAT?

ConQAT (Continuous Quality Assessment Toolkit) is a software quality analysis engine which can be freely configured thanks to its pipes-and-filters architecture. For a detailed description have a look at ConQAT - Wikipedia. Some key points I wanna point out are:

  • Configuration of analysis via Eclipse (using an Eclipse plugin).
  • Support of different languages (e.g. Java, C/C++/C#, ABAP). Due to the flexible architecture of the scan engine it can be enhanced for any language, for example SQLScript, which comes more and more into focus for us.
  • Integration of results from other tools (e.g. SCI, FindBug).

 

How is ConQAT used in our ABAP projects?

In our ABAP projects ConQAT is used in the following way:

  • It is configured to analyze the coding twice a day. This means that the coding of the packages to be analyzed is extracted and analyzed by the ConQAT engine. This process also starts a configured SCI variant; the results of the SCI run are extracted and considered by ConQAT as well. From my point of view I would prefer a higher frequency of analysis runs, but at the moment this is not possible within our landscape. In the future this problem will be solved by the successor of ConQAT (more on that in the Outlook section).
  • The results of ConQAT (with the integrated SCI results) are provided as an HTML dashboard. On the dashboard, an overview section gives a first insight into the results, and the specific sections contain detailed data regarding the analysis. In the dashboard the developer can also navigate down to code level (displayed in the browser), where the remarks are marked at line level. Via an integration of ADT links, the developer can jump directly out of the dashboard to the coding in Eclipse to edit it.

 

ConQAT General Information

ConQAT provides the following general information in the result output. In the following chapters I show just the graphical output of a demo analysis, but of course there is also a text output for the objects for which remarks exist.

 

Overview

The overview page gives an overview of the metrics. It displays how many remarks exist in the whole system and how many exist in the delta to a defined baseline.

01_general_information__overview.png

 

Architecture Specification

With ConQAT it is possible to define an architecture specification. It describes which objects may be used by which other objects (e.g. it can be defined that the UI layer cannot directly use objects from the data access layer). The relationships can be defined from package level down to single-object level. From an ABAP point of view this can be compared to ABAP package interfaces. The following figure displays a specification which defines the relationships on package level.

01b_architecture_specification.png

 

Treemap Outline

The Treemap Outline displays the analyzed ABAP packages. If the developer hovers with the mouse over a package, he gets more information, e.g. the package size (lines of code).

02_general_information__treemap_outline.png

 

System Size Trend

On the System Size Trend page it can be identified how the system size grows over time. It is also visible how many lines of code are generated and how many are coded manually (generated objects can be marked in the configuration).

 

LoC = All Lines of Code (manual, generated, comments)

SLOC = Manual Lines of Code without comment lines

LoCM = Manual Lines of Code with comment lines

LoCG = Generated Lines of Code

03_general_information__system_size_trend_01.png

03_general_information__system_size_trend_02.png

 

Modified Source

The modified source code is also visualized using treemaps, so it can easily be seen where changes were made in the system (added/changed/removed coding).

01_modified_source.png

 

Task Tags

ConQAT also can be configured to report task tags (e.g. TODO, FIXME).

05_general_information__task_tags.png

 

ConQAT Code Metrics

 

Architecture Violations

Violations of the defined architecture (see section above) are displayed in the same graphical way as the architecture specification itself. In addition to the "green" arrows displaying the allowed relations, violations are displayed as "red" arrows.

 

Clone Coverage

ConQAT analyzes clones within the coding. This is not just a search for exactly identical code parts; the algorithm also considers similar code structures.

This check helps to detect coding which can be encapsulated in reusable functionality, and it also helps to detect "copy & paste" coding, which in most cases leads to errors when not all places are adjusted in later versions (because of e.g. a defect or an enhancement). From a Clean Code Development perspective it helps to avoid violations of the "Don't Repeat Yourself" (DRY) principle.


In case the information provided in the dashboard is not enough (even on code level), ConQAT allows comparing clones in detail with the help of an Eclipse plugin.

 

01a_clone_coverage.png

01b_clone_coverage.png

 

Long Programs

ConQAT allows checking for "long programs": classes, programs, ... which have more lines of code than defined in the configuration. Too-long classes, for example, are in most cases evidence that the Single Responsibility principle is violated.

As can be seen in the following figure, it was configured that e.g. classes with up to 600 lines of code are OK (green). Objects with up to 1500 lines of code have to be checked (yellow). More than 1500 lines of code are not OK.

For lines-of-code metrics (like Long Programs and Long Procedures, described in the next section) it can be configured whether comment lines are considered (by default they are excluded). Empty lines are ignored in general.

02_long_programs.png

 

Long Procedures

The Long Procedures metric checks methods, functions, ... regarding their length in lines of code. Violations of this metric give us evidence that too much is done in e.g. one method, and that the code should be extracted into more granular, reusable blocks. The following configuration defines that up to 60 lines of code are OK and that objects with up to 150 lines have to be checked. All objects with more than 150 lines of code are not OK.

 

02_long_procedures.png

 

Deep Nesting

Deep nesting is a classical metric which is also checked by ConQAT. It identifies coding that is too complex to read and understand because of very deep nesting.

 

Our configuration allows up to 5 nesting levels (which is already a high number). Up to 7 levels, the coding has to be checked. More than 7 is not allowed.

 

03_deep_nesting.png

 

Integrated SCI Results

As mentioned before, ConQAT allows integrating the results of SCI check runs. It can be defined which check results are marked as critical warnings or as guideline violations. The integration of the SCI results into the ConQAT dashboard has the advantage that you do not have to check several different places for remarks, and that the results are also provided graphically, which gives a better overview. And of course I see directly what has changed over time.

 

Further features

In the previous chapters I gave a high-level overview of the ConQAT features. The following features were partly already mentioned there, but I wanna explain them in more detail to highlight these functionalities.

 

Baseline Mechanism

ConQAT supports code baselines. That means that you can define a code baseline for which no remarks should be reported in a delta comparison.

 

Depending on your project, the following quality goals are possible:

  • No remarks: No remarks in general. That can be applied to new projects, where you start from scratch. But when a maintenance project is taken over, in most cases that quality goal cannot be applied, because the problems are integrated too deeply in the system, and solving them would lead to additional implementation and test effort (and as we all know, no one wants to pay for such things).
  • No new remarks in changed objects: In changed objects no new/additional remarks are introduced.
  • No remarks in changed objects: In changed objects no new/additional remarks are introduced and all existing remarks are solved.

 

Regarding the quality goals "No new remarks in changed objects" and "No remarks in changed objects" the baseline definition helps us to compare what was already there and what is new.

 

ConQAT analyzes the complete coding and reports the remarks for the whole system, but it can also return just the delta compared to the baseline (if configured).

 

 

Detailed code analysis in browser

In the HTML dashboard the developer can navigate down to code level. On the level of a single object he sees all remarks for the object at the top. When he scrolls through the coding, he also sees the remarks as markers on the left side. So the problems can already be analyzed in detail without entering the system.

 

01a_code.png

01b_code.png

 

Integration with ABAP in Eclipse

With the ADT tools, so-called ADT links were introduced: links which can open ABAP objects directly in Eclipse. This ADT link feature is integrated into the dashboard. So a developer does not need to copy & paste the object name if he wants to edit it. He just has to click on the link to open the object directly in Eclipse, ready for editing.

02_adt.png

 

Blacklisting

Not every remark of ConQAT must be a valid remark (for different reasons). Therefore ConQAT supports blacklisting of remarks, so that they are ignored in further analysis runs.

 

Conclusion & Outlook

As you have seen, ConQAT is a powerful tool for static code analysis which gives a better overview of the system's code quality. With the integration of the SCI results you have the option to define one single place where all check results can be found and analyzed. Due to its flexible architecture, ConQAT also allows further languages to be analyzed which are in focus in the SAP development context (e.g. SQLScript or JavaScript).

At the moment the only thing I do not really like is that the code analysis only runs twice a day due to our configuration.

 

Finally I can say that code quality has made a big step forward since I have been using ConQAT.

Releasing Internal Table Memory


It is a well-known fact that you release the memory occupied by an internal table using either CLEAR or FREE, where FREE also releases the initial memory area. You normally use CLEAR if you want to reuse the table, and FREE if you really want to get rid of it and don't want to refill it later on. Assigning an initial internal table to a filled internal table also releases the target table's memory in the same way as CLEAR does.

 

But last week a colleague pointed out to me that it is not such a well-known fact that deleting lines of internal tables with DELETE normally does not release the memory occupied by the deleted lines. Instead, there seem to be people deleting lines of internal tables in order to release memory. Therefore, as a rule:

 

Deleting lines of an internal table using the DELETE statement does not release the table's memory.

 

For an internal table that was filled and where all lines were deleted using the DELETE statement, the predicate IS INITIAL is in fact true. But the internal table is only initial regarding the number of lines, not regarding the memory occupied. You can check that easily using the memory analysis tools of the ABAP debugger.
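A minimal sketch to reproduce this in a test program (set a breakpoint after the ASSERT and inspect itab in the debugger's memory analysis):

```abap
DATA itab TYPE TABLE OF i.
itab = VALUE #( FOR j = 1 UNTIL j > 1000000 ( j ) ).

DELETE itab WHERE table_line > 0.  " deletes all lines

ASSERT itab IS INITIAL.  " true - no lines are left ...
" ... but in the debugger's memory analysis itab still occupies
" the memory of one million lines until you CLEAR or FREE it:
FREE itab.
```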

 

So far so good. For releasing the memory of an internal table you use CLEAR or FREE and you do not simply DELETE all lines.

 

But what about the use case where you want to delete almost all lines from a big internal table and keep the rest? After deleting, the internal table occupies much more memory than needed for its actual lines. If memory consumption is critical, you might want to get rid of the superfluous memory occupied by such an internal table. How to do that?

 

Spontaneous idea:

 

DELETE almost_all_lines_of_itab.


DATA buffer_tab LIKE itab.
buffer_tab = itab.
CLEAR itab.
itab =  buffer_tab.
CLEAR buffer_tab.

 

Bad idea! Check it in the ABAP debugger. Due to table sharing, after assigning itab to buffer_tab, buffer_tab points to the same memory area as itab. Assigning buffer_tab back to itab after clearing itab is simply an effectless roundtrip and you gain nothing.

 

Improved idea:

 

DELETE almost_all_lines_of_itab.


DATA buffer_tab LIKE itab.
buffer_tab = VALUE #( ( LINES OF itab ) ).
CLEAR itab.
itab =  buffer_tab.
CLEAR buffer_tab.

 

Now it works! Instead of copying itab to buffer_tab you can transfer the lines of itab sequentially to the initial target table and the memory is not shared. Before 7.40, SP08 you have to use INSERT LINES OF itab INTO TABLE buffer_tab instead of the VALUE expression, of course.
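For releases before 7.40, SP08, the same copy without table sharing looks like this:

```abap
DELETE almost_all_lines_of_itab.

DATA buffer_tab LIKE itab.
" line-wise insert instead of assignment -> no table sharing
INSERT LINES OF itab INTO TABLE buffer_tab.
CLEAR itab.
itab = buffer_tab.
CLEAR buffer_tab.
```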

 

What also works for the use case is:

 

DELETE almost_all_lines_of_itab.


DATA buffer_tab LIKE itab.
buffer_tab = itab.
INSERT dummy_line INTO TABLE buffer_tab.
DELETE buffer_tab WHERE table_line = dummy_line.
CLEAR itab.
itab =  buffer_tab.
CLEAR buffer_tab.

 

By inserting a dummy line into buffer_tab and deleting it again, the table sharing is canceled and buffer_tab is built from scratch (but only if it needs considerably less memory than before; otherwise it is copied and nothing is gained again).

 

Ingenious minds might also find the following ways:

 

DELETE almost_all_lines_of_itab.


DATA buffer_string TYPE xstring.
EXPORT itab TO DATA BUFFER buffer_string.
CLEAR itab.
IMPORT itab FROM DATA BUFFER buffer_string.
CLEAR buffer_string.

 

or even

 

DELETE almost_all_lines_of_itab.


CALL TRANSFORMATION id SOURCE itab = itab
                       RESULT XML DATA(buffer_string).
CLEAR itab.
CALL TRANSFORMATION id SOURCE XML buffer_string
                       RESULT itab = itab.

CLEAR buffer_string.

 

Yes, those work too, but put some GET RUN TIME FIELD statements around them to see that they are not the best ideas ...
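Such a measurement can be sketched like this (the absolute numbers depend on your system, of course; the point is to compare the variants against each other):

```abap
DATA: t0 TYPE i,
      t1 TYPE i.

GET RUN TIME FIELD t0.
" variant under test, e.g. the EXPORT/IMPORT roundtrip:
EXPORT itab TO DATA BUFFER buffer_string.
CLEAR itab.
IMPORT itab FROM DATA BUFFER buffer_string.
GET RUN TIME FIELD t1.

" runtime of the variant in microseconds
DATA(duration) = t1 - t0.
```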

Create a Formatted Excel in a Background Job



NOTE: Before beginning, the XLSX Workbench functionality must be available in your system.

 

Suppose we need to generate an Excel file in background mode.

For ease of example, let's create a form that contains only the classical phrase "Hello world!" nested in a rectangle area. We will send the resulting Excel file via SAP mail (in this case, to ourselves).

 

1 PREPARE A PRINTING PROGRAM.

 

As you can see, most of the code is taken up by the mailing (and does not apply to the form creation):

 

REPORT z_hello_world.

* declare and fill context
DATA gs_context TYPE lvc_s_tabl.
DATA gv_document_rawdata TYPE mime_data.
gs_context-value = 'Hello world!'.

* call the form
CALL FUNCTION 'ZXLWB_CALLFORM'
  EXPORTING
    iv_formname         = 'HELLO_WORLD'
    iv_context_ref      = gs_context
    iv_viewer_suppress  = 'X'
  IMPORTING
    ev_document_rawdata = gv_document_rawdata
  EXCEPTIONS
    OTHERS              = 2.
IF sy-subrc NE 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

* mailing
PERFORM send_mail USING gv_document_rawdata.

*&---------------------------------------------------------------------*
*&      Form  send_mail
*&---------------------------------------------------------------------*
FORM send_mail USING pv_document_rawdata TYPE mime_data.
  DATA:
    lv_attachment_size TYPE sood-objlen,
    lv_subject         TYPE so_obj_des,
    lv_document_size   TYPE i,
    lt_document_table  TYPE solix_tab.
  DATA:
    lr_send_request    TYPE REF TO cl_bcs,
    lr_mail_message    TYPE REF TO cl_document_bcs,
    lr_recipient       TYPE REF TO if_recipient_bcs,
    lr_error           TYPE REF TO i_oi_error,
    ls_retcode         TYPE soi_ret_string,
    lv_attachment_type TYPE soodk-objtp VALUE 'XLS'.

  CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
    EXPORTING
      buffer        = pv_document_rawdata
    IMPORTING
      output_length = lv_document_size
    TABLES
      binary_tab    = lt_document_table.

  lr_send_request = cl_bcs=>create_persistent( ).

  lv_subject = 'test mail'.
  lr_mail_message = cl_document_bcs=>create_document(
      i_type    = 'RAW'
      i_subject = lv_subject ).

  lv_attachment_size = lv_document_size.
  TRY.
      lr_mail_message->add_attachment(
          i_attachment_type    = lv_attachment_type
          i_attachment_subject = space
          i_attachment_size    = lv_attachment_size
          i_att_content_hex    = lt_document_table ).
    CATCH cx_document_bcs.
  ENDTRY.
  lr_send_request->set_document( lr_mail_message ).

  lr_recipient = cl_sapuser_bcs=>create( sy-uname ).

  lr_send_request->set_send_immediately( abap_on ).

  lr_send_request->add_recipient(
      i_recipient = lr_recipient
      i_express   = abap_on ).

  lr_send_request->send( i_with_error_screen = abap_on ).

  COMMIT WORK.
ENDFORM.                    "send_mail


2 PREPARE A FORM.

 

2.1 Launch XLSX Workbench, and in the popup window specify a form name HELLO_WORLD , and then press the button «Process»:

 

 

 

Empty form will be displayed:

 

123.PNG

2.2 Press the button 444_19_2.PNG to save the form.

 

 

2.3 Assign context LVC_S_TABL to the form:


 

 

You will then be prompted to create the form's structure automatically (based on the context):

00_6_3.PNG

Let's press the button.

 

As a result, «Pattern» and «Value» will be added under the «Sheet» node in the form structure tree:

 

124.PNG

 

The added components will already be bound to the context. For these components, only the template binding is still required.

We'll do that later; first we perform the markup of the template.

 

 

 

2.4 Create the markup in the Excel template:


 

 

 


2.5 Template binding:


Assign «Pattern» to a target area in the Excel template. To do this, perform the following steps in order:

 

  • Place the cursor on the node in the form's structure tree;
  • Select the cell range [A1 : C3] in the Excel template;
  • Press the button located in the item «Area in the template» of the Properties tab:

 

 

 

 

Similarly, assign «Value» to a target area in the Excel template. To do this, perform the following steps in order:
  • Place the cursor on the node in the form's structure tree;
  • Select the cell range [B2] in the Excel template;
  • Press the button located in the item «Area in the template» of the Properties tab:

 

 

 

Scheme of bindings:

 

2.6 Activate the form by pressing the button 444_30.PNG.

 

 

3 EXECUTION.


Launch SE38 and run your report Z_HELLO_WORLD in background mode:


 


125.PNG


 

 

 

126.PNG

 

 
