
    IBM DB2

    JSA has two options for integrating events from IBM® DB2®.

    See the following topics:

    Integration of IBM DB2 with LEEF Events

    The IBM® DB2® DSM allows the integration of DB2® events in LEEF format from an IBM z/OS® mainframe by using IBM® Security zSecure®.

    Using a zSecure process, events from the System Management Facilities (SMF) are recorded to an event file in the Log Event Extended Format (LEEF). JSA retrieves the LEEF event log files by using the log file protocol and processes the events. You can schedule JSA to retrieve events on a polling interval.
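A LEEF event is a pipe-delimited header followed by tab-separated key=value attributes. The following Python sketch shows one way such a line might be parsed; the sample event and its attribute names are illustrative placeholders, not output from an actual zSecure feed.

```python
def parse_leef(line):
    """Split a LEEF 1.0 event into header fields and key=value attributes."""
    # Header: LEEF:Version|Vendor|Product|ProductVersion|EventID|attributes
    parts = line.split("|", 5)
    if len(parts) < 6 or not parts[0].startswith("LEEF:"):
        raise ValueError("not a LEEF event")
    header = {
        "version": parts[0].split(":", 1)[1],
        "vendor": parts[1],
        "product": parts[2],
        "product_version": parts[3],
        "event_id": parts[4],
    }
    # Attributes are tab-separated key=value pairs.
    attrs = {}
    for pair in parts[5].split("\t"):
        if "=" in pair:
            key, value = pair.split("=", 1)
            attrs[key] = value
    return header, attrs

# Hypothetical sample event for illustration only.
sample = "LEEF:1.0|IBM|zSecure Audit|2.1|DB2_ACCESS|usrName=DBUSER1\tsrc=10.0.0.5"
header, attrs = parse_leef(sample)
```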

    To integrate IBM® DB2® events:

    1. Confirm that your installation meets any prerequisite installation requirements. For more information, see Before You Begin.

    2. Configure your IBM® DB2® image to write events in LEEF format. For more information, see the IBM® Security zSecure Suite: CARLa-Driven Components Installation and Deployment Guide.

    3. Create a log source in JSA for IBM® DB2® to retrieve your LEEF formatted event logs. For more information, see Creating a Log Source for IBM DB2.

    4. Optional. Create a custom event property for IBM® DB2® in JSA. For more information, see the JSA Custom Event Properties for IBM z/OS technical note.

    Before You Begin

    Before you can configure the data collection process, you must complete the basic zSecure installation process.

    Ensure that the following prerequisites are met:

    • You must ensure that parmlib member IFAPRDxx is enabled for IBM® Security zSecure Audit on your IBM® DB2® z/OS® image.

    • The SCKRLOAD library must be APF-authorized.

    • You must configure a process to periodically refresh your CKFREEZE and UNLOAD data sets.

    • You must configure an SFTP, FTP, or SCP server on your z/OS® image for JSA to download your LEEF event files.

    • You must allow SFTP, FTP, or SCP traffic on firewalls that are located between JSA and your z/OS® image.
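As a quick sanity check on the firewall path before you configure the log source, you can test TCP reachability of the file transfer port from the JSA side. This is a generic socket check, not a JSA feature, and the host name below is a placeholder.

```python
import socket

def port_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the SFTP port (22) on a placeholder z/OS host.
# print(port_reachable("zos.example.com", 22))
```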

    Following the software installation, you must complete the postinstallation activities to create and modify the configuration. For instructions on installing and configuring zSecure, see the IBM® Security zSecure Suite: CARLa-Driven Components Installation and Deployment Guide.

    Creating a Log Source for IBM DB2

    A log file protocol source allows JSA to retrieve archived log files from a remote host.

    The IBM® DB2® DSM supports the bulk loading of log files by using the log file protocol source. When you configure your IBM® DB2® system to use the log file protocol, make sure that the host name or IP address that is configured in the IBM® DB2® system matches the value that is configured in the Remote Host parameter in the log file protocol configuration.

    1. Log in to JSA.
    2. Click the Admin tab.
    3. Click the Log Sources icon.
    4. Click Add.
    5. In the Log Source Name field, type a name for the log source.
    6. In the Log Source Description field, type a description for the log source.
    7. From the Log Source Type list, select IBM® DB2®.
    8. From the Protocol Configuration list, select Log File.
    9. Configure the following values:

      Table 1: IBM DB2 Log File Protocol Parameters

      Parameter

      Description

      Log Source Identifier

      Type an IP address, host name, or name to identify the event source. Using an IP address or host name is suggested because it allows JSA to attribute log files to a unique event source.

      For example, if your network contains multiple devices, such as multiple z/OS® images or a file repository that contains all of your event logs, specify a name, IP address, or host name for the image or location that uniquely identifies events for the IBM® DB2® log source. This specification allows events to be identified at the image or location level in your network.

      Service Type

      From the list, select the protocol that you want to use when retrieving log files from a remote server. The default is SFTP.

      • SFTP: SSH File Transfer Protocol

      • FTP: File Transfer Protocol

      • SCP: Secure Copy

      The underlying protocol that is used to retrieve log files for the SCP and SFTP service type requires that the server specified in the Remote IP or Hostname field has the SFTP subsystem enabled.

      Remote IP or Hostname

      Type the IP address or host name of the device that stores your event log files.

      Remote Port

      Type the TCP port on the remote host that is running the selected Service Type. The valid range is 1 - 65535.

      The options include the following ports:

      • FTP: TCP port 21

      • SFTP: TCP port 22

      • SCP: TCP port 22

      If the host for your event files is using a non-standard port number for FTP, SFTP, or SCP, you must adjust the port value.

      Remote User

      Type the user name necessary to log in to the host that contains your event files.

      The user name can be up to 255 characters in length.

      Remote Password

      Type the password necessary to log in to the host.

      Confirm Password

      Confirm the password necessary to log in to the host.

      SSH Key File

      If you select SCP or SFTP as the Service Type, this parameter gives the option to define an SSH private key file. When you provide an SSH Key File, the Remote Password field is ignored.

      Remote Directory

      Type the directory location on the remote host from which the files are retrieved, relative to the user account you are using to log in.

      For FTP only. If your log files are in the remote user's home directory, you can leave the remote directory blank. This option supports operating systems where the change working directory (CWD) command is restricted.

      Recursive

      Select this check box if you want the file pattern to search subfolders in the remote directory. By default, the check box is clear.

      The Recursive option is ignored if you configure SCP as the Service Type.

      FTP File Pattern

      If you select SFTP or FTP as the Service Type, this option allows the configuration of the regular expression (regex) required to filter the list of files that are specified in the Remote Directory. All matching files are included in the processing.

      The FTP file pattern that you specify must match the name that you assigned to your event files. For example, to collect comma-delimited files that end with .del, type the following code:

      .*\.del

      Use of this parameter requires knowledge of regular expressions (regex). For more information, see the following website: http://download.oracle.com/javase/tutorial/essential/regex/

      FTP Transfer Mode

      From the list, select ASCII for comma-delimited, text, or ASCII log sources that require an ASCII FTP file transfer mode.

      This option displays only if you select FTP as the Service Type.

      SCP Remote File

      If you select SCP as the Service Type, you must type the file name of the remote file.

      Start Time

      Type the time of day you want the processing to begin. For example, type 00:00 to schedule the log file protocol to collect event files at midnight.

      This parameter functions with the Recurrence value to establish when and how often the Remote Directory is scanned for files. Type the start time, based on a 24-hour clock, in the following format: HH:MM.

      Recurrence

      Type the frequency, beginning at the Start Time, that you want the remote directory to be scanned. Type this value in hours (H), minutes (M), or days (D).

      For example, type 2H if you want the remote directory to be scanned every 2 hours from the start time. The default is 1H.

      Run On Save

      Select this check box if you want the log file protocol to run immediately after you click Save.

      After the Run On Save completes, the log file protocol follows your configured start time and recurrence schedule.

      Selecting Run On Save clears the list of previously processed files for the Ignore Previously Processed File parameter.

      EPS Throttle

      Type the number of Events Per Second (EPS) that you do not want this protocol to exceed. The valid range is 100 - 5000.

      Processor

      From the list, select None.

      Processors allow event file archives to be expanded and the contents to be processed for events. Files are only processed after they are downloaded to JSA. JSA can process files in zip, gzip, tar, or tar+gzip archive format.

      Ignore Previously Processed File(s)

      Select this check box to track and ignore files that are already processed by the log file protocol.

      JSA examines the log files in the remote directory to determine if a file is previously processed by the log file protocol. If a previously processed file is detected, the log file protocol does not download the file for processing. All files that are not previously processed are downloaded.

      This option applies only to FTP and SFTP Service Types.

      Change Local Directory?

      Select this check box to define a local directory on your JSA for storing downloaded files during processing.

      It is suggested that you leave this check box clear. When this check box is selected, the Local Directory field is displayed, which gives the option to configure the local directory to use for storing files.

      Event Generator

      From the Event Generator list, select LineByLine.

      The Event Generator applies more processing to the retrieved event files. Each line of the file is a single event. For example, if a file has 10 lines of text, 10 separate events are created.

    10. Click Save.
    11. On the Admin tab, click Deploy Changes.
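The FTP File Pattern parameter in the table above takes a regular expression, so it is worth previewing which file names a candidate pattern would select before you save the log source. The Python sketch below is an approximate preview (Python and Java regex syntax agree for simple patterns like this one); whole-name matching is assumed.

```python
import re

def matching_files(pattern, filenames):
    """Return the file names that the whole-name pattern would select."""
    regex = re.compile(pattern)
    return [name for name in filenames if regex.fullmatch(name)]

# .*\.del selects the comma-delimited audit files and skips everything else.
files = ["audit.del", "context.del", "readme.txt", "audit.del.bak"]
selected = matching_files(r".*\.del", files)
```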

    Integrating IBM DB2 Audit Events

    The IBM® DB2® DSM allows you to integrate your DB2® audit logs into JSA for analysis.

    When auditing is configured and enabled, the db2audit command creates a set of comma-delimited text files with a .del extension that define the scope of audit data for JSA. The comma-delimited files that are created by the db2audit command include:

    • audit.del

    • checking.del

    • context.del

    • execute.del

    • objmaint.del

    • secmaint.del

    • sysadmin.del

    • validate.del
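These .del files are ordinary comma-delimited text that uses double-quotation marks as the text delimiter, so they can be inspected with any CSV reader. A minimal reading sketch follows; the sample rows and their columns are illustrative only, because the real column layout depends on your DB2 version and audit configuration.

```python
import csv
import io

def read_del(text):
    """Parse comma-delimited audit text that uses '"' as the text delimiter."""
    reader = csv.reader(io.StringIO(text), delimiter=",", quotechar='"')
    return [row for row in reader if row]

# Hypothetical two-field sample; real audit.del rows carry many more columns.
sample = '"2009-12-17-12.50.28","CHECKING_OBJECT"\n"2009-12-17-12.50.29","VALIDATE"\n'
rows = read_del(sample)
```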

    To integrate the IBM® DB2® DSM with JSA, you must:

    1. Use the db2audit command to ensure that IBM® DB2® records security events. See your IBM® DB2® vendor documentation for more information.

    2. Extract the DB2® audit data that is contained in the instance to a log file, depending on your version of IBM® DB2®:

      If you are using DB2® v9.5 and later, see Extracting Audit Data: DB2 V9.5 and Later,

      or

      If you are using DB2® v8.x to v9.4, see Extracting Audit Data: DB2 V8.x to V9.4.

    3. Use the log file protocol source to pull the output instance log file and send that information back to JSA on a scheduled basis. JSA then imports and processes this file. See Creating a Log Source for IBM DB2.

    Note: The IBM® DB2® DSM does not support the IBM z/OS mainframe operating system.

    Extracting Audit Data: DB2 V9.5 and Later

    You can extract audit data when you are using IBM® DB2® v9.5 and later.

    1. Log in to a DB2® account with SYSADMIN privilege.
    2. Move the audit records from the database instance to the audit log:

      db2audit flush

      For example, the flush command response might resemble the following output:

      AUD00001 Operation succeeded.

    3. Archive and move the active instance to a new location for future extraction:

      db2audit archive

      For example, an archive command response might resemble the following output:

      Node AUD  Archived or Interim Log File              Message
      ---- ---  ----------------------------------------  -----------------------------
         0      db2audit.instance.log.0.20091217125028    AUD00001 Operation succeeded.

      Note: In DB2® v9.5 and later, the archive command replaces the prune command.

      The archive command moves the active audit log to a new location, effectively pruning all non-active records from the log. An archive command must be complete before an extract can be executed.

    4. Extract the data from the archived audit log and write the data to .del files:

      db2audit extract delasc from files db2audit.instance.log.0.20091217125028

      For example, an extract command response might resemble the following output:

      AUD00001 Operation succeeded.

      Note: Double-quotation marks (") are used as the default text delimiter in the ASCII files; do not change the delimiter.

    5. Move the .del files to a storage location where JSA can pull the file. The movement of the comma-delimited (.del) files should be synchronized with the file pull interval in JSA.
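Step 5 can be scripted so that freshly extracted .del files land in the directory that the log file protocol polls. The following is a minimal sketch under the assumptions that the extract step writes its .del files into a known output directory and that the SFTP-served directory is writable by the script; both paths are placeholders. Schedule it (for example, via cron) just before each JSA pull interval.

```python
import shutil
from pathlib import Path

def stage_del_files(extract_dir, pull_dir):
    """Move every .del file from extract_dir into the directory JSA polls."""
    extract_dir, pull_dir = Path(extract_dir), Path(pull_dir)
    pull_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for del_file in sorted(extract_dir.glob("*.del")):
        target = pull_dir / del_file.name
        shutil.move(str(del_file), target)
        moved.append(target.name)
    return moved

# Placeholder paths for illustration:
# stage_del_files("/db2/audit/extract", "/sftp/jsa/db2")
```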

      You are now ready to configure JSA to receive DB2® log files. See Creating a Log Source for IBM DB2.

    Extracting Audit Data: DB2 V8.x to V9.4

    You can extract audit data when you are using IBM® DB2® v8.x to v9.4.

    1. Log in to a DB2® account with SYSADMIN privilege.
    2. Type the following start command to audit a database instance:

      db2audit start

      For example, the start command response might resemble the following output:

      AUD00001 Operation succeeded.

    3. Move the audit records from the instance to the audit log:

      db2audit flush

      For example, the flush command response might resemble the following output:

      AUD00001 Operation succeeded.

    4. Extract the data from the archived audit log and write the data to .del files:

      db2audit extract delasc

      For example, an extract command response might resemble the following output:

      AUD00001 Operation succeeded.

      Note: Double-quotation marks (") are used as the default text delimiter in the ASCII files; do not change the delimiter.

    5. Remove non-active records:

      db2audit prune all

    6. Move the .del files to a storage location where JSA can pull the file. The movement of the comma-delimited (.del) files should be synchronized with the file pull interval in JSA.

      You are now ready to create a log source in JSA to receive DB2® log files.

    To create the log source, follow the steps in Creating a Log Source for IBM DB2, earlier in this topic.

    Modified: 2017-09-13