
Broadcom CA Top Secret

Broadcom CA Top Secret was formerly known as CA Technologies Top Secret. The product name remains CA Top Secret in JSA.

The Broadcom CA Top Secret DSM collects events from a Broadcom CA Technologies Top Secret image on an IBM z/OS mainframe by using IBM Security zSecure.

When you use a zSecure process, events from the System Management Facilities (SMF) can be transformed into Log Event Extended Format (LEEF) events. These events can be sent to JSA in near real-time by using the UNIX Syslog protocol, or JSA can retrieve the LEEF event log files by using the Log File protocol. When you use the Log File protocol, you can schedule JSA to retrieve the event files on a polling interval that you define.
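
A LEEF 1.0 event consists of a pipe-delimited header followed by tab-separated key=value attributes. The following minimal Python sketch parses one such line; the sample event text and attribute names are invented for illustration and do not reproduce actual zSecure output.

    # Minimal sketch: split a LEEF 1.0 line into header fields and attributes.
    # The sample line and attribute names are illustrative only.
    sample = ("LEEF:1.0|CA|Top Secret|1.0|TSS_EVENT|"
              "devTime=Apr 01 2024 00:15:32\tusrName=EXAMPLE1\tsev=5")

    leef_version, vendor, product, version, event_id, attrs = sample.split("|", 5)
    fields = dict(pair.split("=", 1) for pair in attrs.split("\t") if pair)
    print(event_id, fields["usrName"])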

To collect CA Top Secret events, complete the following steps:

  1. Verify that your installation meets any prerequisite installation requirements.

  2. Configure your IBM z/OS image to write events in LEEF format.

  3. Create a log source in JSA for CA Top Secret.

  4. If you want to create a custom event property for CA Top Secret in JSA, see the Custom Event Properties for IBM z/OS Technical note.

Before You Begin

Before you can configure the data collection process, you must complete the basic zSecure installation and the post-installation activities that create and modify the configuration.

The following prerequisites apply:

  • You must ensure that the IFAPRDxx parmlib member is enabled for IBM Security zSecure Audit on your z/OS image.

  • The SCKRLOAD library must be APF-authorized.

  • If you are using the direct SMF INMEM real-time interface, you must have the necessary software installed (APAR OA49263) and set up the SMFPRMxx member to include the INMEM keyword and parameters; an illustrative fragment follows this list. If you decide to use the CDP interface, you must also have CDP installed and running.

  • You must configure a process to periodically refresh your CKFREEZE and UNLOAD data sets.

  • If you are using the Log File protocol method, you must configure an SFTP, FTP, or SCP server on your z/OS image so that JSA can download your LEEF event files.

  • If you are using the Log File protocol method, you must allow SFTP, FTP, or SCP traffic on firewalls that are located between JSA and your z/OS image.
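
The exact SMFPRMxx statements depend on your installation. The following fragment is only an illustration of the INMEM keyword that is referenced in the prerequisites; the resource name, buffer size, and SMF record types shown are placeholders that you must replace with the values your zSecure configuration requires.

    INMEM(IFASMF.ZSECURE,RESSIZMAX(128M),TYPE(80))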

Creating a Log Source for Log File Protocol

The Log File protocol enables JSA to retrieve archived log files from a remote host for the IBM z/OS, IBM CICS, IBM RACF, IBM DB2, CA Top Secret, and CA ACF2 DSMs.

Log files are transferred, one at a time, to JSA for processing. The Log File protocol can manage plain text event logs, compressed files, or archives. Archives must contain plain-text files that can be processed one line at a time. Multi-line event logs are not supported by the Log File protocol. IBM z/OS with zSecure writes log files to a specified directory as gzip archives. JSA extracts the archive and processes the events, which are written as one event per line in the file.
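
As a rough illustration of that one-event-per-line layout, the following Python sketch opens a gzip archive of the kind described above and counts the events that it contains. The file name is a placeholder that follows the <product_name>.<timestamp>.gz naming pattern described later in this topic.

    import gzip

    # Placeholder file name; zSecure writes archives such as <product_name>.<timestamp>.gz.
    event_count = 0
    with gzip.open("zOS.20240401T000000.gz", "rt", errors="replace") as log:
        for line in log:  # one LEEF event per line
            if line.strip():
                event_count += 1
    print(event_count, "events in archive")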

To retrieve these events, you must create a log source that uses the Log File protocol. JSA requires credentials to log in to the system that hosts your LEEF formatted event files and a polling interval.

  1. Log in to JSA.

  2. Click the Admin tab.

  3. Click the Log Sources icon.

  4. Click Add.

  5. In the Log Source Name field, type a name for the log source.

  6. In the Log Source Description field, type a description for the log source.

  7. From the Log Source Type list, select your DSM name.

  8. From the Protocol Configuration list, select Log File.

  9. Configure the Log File protocol parameters.

    The following table describes the parameters that require specific values for the DSM event collection:

    Table 1: Log File Protocol Parameters

    Parameter

    Value

    Log Source Identifier

    Type an IP address, host name, or name to identify the event source. IP addresses or host names are suggested because they allow JSA to associate a log file with a unique event source.

    For example, if your network contains multiple devices, such as multiple z/OS images or a file repository that contains all of your event logs, specify a name, IP address, or host name for the image or location that uniquely identifies events for the DSM log source. This specification enables events to be identified at the image or location level in your network.

    Service Type

    From the Service Type list, select the protocol that you want to use when retrieving log files from a remote server. The default is SFTP.

    • SFTP - SSH File Transfer Protocol

    • FTP - File Transfer Protocol

    • SCP - Secure Copy

    The underlying protocol that is used to retrieve log files for the SCP and SFTP service type requires that the server that is specified in the Remote IP or Hostname field has the SFTP subsystem enabled.

    Remote IP or Hostname

    Type the IP address or host name of the device that stores your event log files.

    Remote Port

    Type the TCP port on the remote host that is running the selected Service Type. The valid range is 1 - 65535.

    The standard ports are:

    • FTP - TCP Port 21

    • SFTP - TCP Port 22

    • SCP - TCP Port 22

    If the host for your event files is using a non-standard port number for FTP, SFTP, or SCP, you must adjust the port value.

    Remote User

    Type the user name or user ID necessary to log in to the system that contains your event files.

    • If your log files are on your IBM z/OS image, type the user ID necessary to log in to your IBM z/OS image. The user ID can be up to 8 characters in length.

    • If your log files are on a file repository, type the user name necessary to log in to the file repository. The user name can be up to 255 characters in length.

    Remote Password

    Type the password necessary to log in to the host.

    Confirm Password

    Confirm the password necessary to log in to the host.

    SSH Key File

    If you select SCP or SFTP as the Service Type, this parameter gives you the option to define an SSH private key file. When you provide an SSH Key File, the Remote Password field is ignored.

    Remote Directory

    Type the directory location on the remote host from which the files are retrieved, relative to the user account you are using to log in.

    Recursive

    If you want the file pattern to search subfolders in the remote directory, select this check box. By default, the check box is clear.

    If you configure SCP as the Service Type, the Recursive option is ignored.

    FTP File Pattern

    If you select SFTP or FTP as the Service Type, you can configure the regular expression (regex) needed to filter the list of files that are specified in the Remote Directory. All matching files are included in the processing.

    The IBM z/OS mainframe that uses IBM Security zSecure Audit writes event files by using the pattern: <product_name>.<timestamp>.gz

    The FTP file pattern that you specify must match the name that you assigned to your event files. For example, to collect files that start with zOS and end with .gz, type the following code:

    zOS.*\.gz

    Use of this parameter requires knowledge of regular expressions (regex). For more information about regex, see Lesson: Regular Expressions (http://download.oracle.com/javase/tutorial/essential/regex/). A short sketch after these configuration steps shows how the example pattern matches file names.

    FTP Transfer Mode

    This option displays only if you select FTP as the Service Type. From the list, select Binary.

    The binary transfer mode is needed for event files that are stored in a binary or compressed format, such as zip, gzip, tar, or tar+gzip archive files.

    SCP Remote File

    If you select SCP as the Service Type, you must type the file name of the remote file.

    Start Time

    Type the time of day you want the processing to begin. For example, type 00:00 to schedule the Log File protocol to collect event files at midnight.

    This parameter functions with the Recurrence value to establish when and how often the Remote Directory is scanned for files. Type the start time, based on a 24-hour clock, in the following format: HH:MM.

    Recurrence

    Type the frequency, beginning at the Start Time, that you want the remote directory to be scanned. Type this value in hours (H), minutes (M), or days (D).

    For example, type 2H if you want the remote directory to be scanned every 2 hours from the start time. The default is 1H.

    Run On Save

    If you want the Log File protocol to run immediately after you click Save, select this check box.

    After the Run On Save completes, the Log File protocol follows your configured start time and recurrence schedule.

    Selecting Run On Save clears the list of previously processed files for the Ignore Previously Processed File parameter.

    EPS Throttle

    Type the number of Events Per Second (EPS) that you do not want this protocol to exceed. The valid range is 100 - 5000.

    Processor

    From the list, select gzip.

    Processors enable event file archives to be expanded and their contents processed for events. Files are processed after they are downloaded to JSA. JSA can process files in zip, gzip, tar, or tar+gzip archive format.

    Ignore Previously Processed File(s)

    Select this check box to track and ignore files that are already processed by the Log File protocol.

    JSA examines the log files in the remote directory to determine whether a file was previously processed by the Log File protocol. If a previously processed file is detected, the Log File protocol does not download the file for processing. All files that were not previously processed are downloaded.

    This option applies only to FTP and SFTP service types.

    Change Local Directory?

    Select this check box to define a local directory on your JSA for storing downloaded files during processing.

    It is suggested that you leave this check box clear. When this check box is selected, the Local Directory field is displayed, which gives you the option to configure the local directory to use for storing files.

    Event Generator

    From the Event Generator list, select LineByLine.

    The Event Generator applies more processing to the retrieved event files. Each line is a single event. For example, if a file has 10 lines of text, 10 separate events are created.

  10. Click Save.

  11. On the Admin tab, click Deploy Changes.

    The DSM configuration is complete. If your DSM requires custom event properties, see the Custom Event Properties for IBM z/OS Technical note.
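
As referenced in the FTP File Pattern description above, the following minimal Python sketch shows how a pattern such as zOS.*\.gz selects matching file names. The example file names are placeholders in the <product_name>.<timestamp>.gz form.

    import re

    # Placeholder file names; only those that match the configured pattern are retrieved.
    candidates = ["zOS.20240401T000000.gz", "RACF.20240401T000000.gz", "zOS.readme.txt"]

    pattern = re.compile(r"zOS.*\.gz")
    for name in candidates:
        print(name, "->", bool(pattern.fullmatch(name)))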

Create a Log Source for Near Real-time Event Feed

The Syslog protocol enables JSA to receive System Management Facilities (SMF) events in near real-time from a remote host.

The following DSMs are supported:

  • IBM z/OS

  • IBM CICS

  • IBM RACF

  • IBM DB2

  • CA Top Secret

  • CA ACF2

If JSA does not automatically detect the log source, add a log source for your DSM on the JSA console.

The following table describes the parameters that require specific values for event collection for your DSM:

Table 2: Log Source Parameters

Parameter

Value

Log Source type

Select your DSM name from the list.

Protocol Configuration

Syslog

Log Source Identifier

Type a unique identifier for the log source.
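
If you want to verify that syslog traffic can reach JSA before zSecure starts sending events, a minimal check such as the following Python sketch can help. The collector address, the use of UDP port 514, and the sample LEEF payload are assumptions; replace them with the values for your environment.

    import socket

    # Placeholder values: replace with your JSA event collector address and syslog port.
    JSA_COLLECTOR = "192.0.2.10"
    SYSLOG_PORT = 514

    # Illustrative LEEF 1.0 payload; real events are produced by zSecure, not by this script.
    message = "<13>Apr  1 00:15:32 zos01 LEEF:1.0|CA|Top Secret|1.0|TSS_EVENT|usrName=EXAMPLE1"

    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (JSA_COLLECTOR, SYSLOG_PORT))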

Integrate Broadcom CA Top Secret with JSA by Using Audit Scripts

The Broadcom CA Top Secret DSM collects events and audit transactions on the IBM mainframe with the Log File protocol.

JSA records all relevant and available information from the event.

To integrate CA Top Secret events into JSA:

  1. The IBM mainframe records all security events as System Management Facilities (SMF) records in a live repository.

  2. At midnight, the CA Top Secret data is extracted from the live repository by using the SMF dump utility. The SMF file contains all of the events and fields from the previous day in raw SMF format.

  3. The qextopsloadlib program pulls data from the SMF formatted file. The qextopsloadlib program pulls only the relevant events and fields for JSA and writes that information in a condensed format for compatibility (see the parsing sketch after this list). The information is saved in a location accessible by JSA.

  4. JSA uses the Log File protocol source to retrieve the output file information on a scheduled basis. JSA then imports and processes this file.
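
The condensed output that qextops produces is pipe-delimited, as described in the note later in this topic. The following Python sketch shows the general idea of splitting such a record into fields; the sample record is invented, because the actual field order depends on your TSSUTIL (EARLOUT) report.

    # Invented sample record; the real field order comes from your TSSUTIL (EARLOUT) report.
    record = "TSS_EVENT|2024/04/01|00:15:32|EXAMPLE1|RESOURCE.ACCESS|PERMIT"

    # Split on the pipe delimiter and strip any surrounding blanks.
    fields = [field.strip() for field in record.split("|")]
    print(fields)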

Configuring Broadcom CA Top Secret to Integrate with JSA by Using Audit Scripts

The Broadcom CA Top Secret DSM collects events and audit transactions on the IBM mainframe by using the Log File protocol.

  1. From https://support.juniper.net/support/downloads/, download the following compressed file:

    qextops_bundled.tar.gz

  2. On a Linux operating system, extract the file:

    tar -zxvf qextops_bundled.tar.gz

    The following files are contained in the archive:

    • qextops_jcl.txt

    • qextopsloadlib.trs

    • qextops_trsmain_JCL.txt

  3. Load the files onto the IBM mainframe by using any terminal emulator file transfer method.

    Upload the sample qextops_trsmain_JCL.txt and qextops_jcl.txt files by using the TEXT protocol.

  4. Upload the qextopsloadlib.trs file by using a BINARY mode transfer. The qextopsloadlib.trs file is a tersed file that contains the executable (the mainframe program qextops). When you upload the .trs file from a workstation, preallocate a file on the mainframe with the following DCB attributes: DSORG=PS, RECFM=FB, LRECL=1024, BLKSIZE=6144. The file transfer type must be binary mode and not text.

    Note:

    Qextops is a small C mainframe program that reads the output of the TSSUTIL (EARLOUT data) line by line. Qextops adds a header to each record that contains event information, for example, the record descriptor, date, and time. The program places each field into the output record, suppresses trailing blank characters, and delimits each field with the pipe character. This output file is formatted for JSA, and the blank suppression reduces network traffic to JSA. The program consumes minimal CPU and disk I/O resources.

  5. Customize the qextops_trsmain_JCL.txt file according to your installation-specific requirements.

    The qextops_trsmain_JCL.txt file uses the IBM utility TRSMAIN to extract the program that is stored in the qextopsloadlib.trs file.

    An example of the qextops_trsmain_JCL.txt file includes:

    //TRSMAIN JOB (yourvalidjobcard),Q1labs,
    // MSGCLASS=V
    //DEL EXEC PGM=IEFBR14
    //D1 DD DISP=(MOD,DELETE),DSN=<yourhlq>.QEXTOPS.TRS,
    // UNIT=SYSDA,
    // SPACE=(CYL,(10,10))
    //TRSMAIN EXEC PGM=TRSMAIN,PARM='UNPACK'
    //SYSPRINT DD SYSOUT=*,DCB=(LRECL=133,BLKSIZE=12901,RECFM=FBA)
    //INFILE DD DISP=SHR,DSN=<yourhlq>.QEXTOPS.TRS
    //OUTFILE DD DISP=(NEW,CATLG,DELETE),
    // DSN=<yourhlq>.LOAD,
    // SPACE=(CYL,(10,10,5),RLSE),UNIT=SYSDA
    //

    You must update the file with your installation-specific information for parameters such as the job card, data set naming conventions, output destinations, retention periods, and space requirements.

    The .trs input file is an IBM TERSE formatted library and is extracted by running the JCL, which calls the TRSMAIN. This tersed file, when extracted, creates a PDS linklib with the qextops program as a member.

  6. You can STEPLIB to this library or choose to move the program to one of the LINKLIBs that are in the LINKLST. The program does not require authorization.

  7. Following the upload, copy the program to an existing linklisted library, or add a STEPLIB DD statement with the correct data set name of the library that contains the program.

  8. The qextops_jcl.txt file is a text file that contains a sample JCL. You must configure the job card to meet your site requirements.

    The qextops_jcl.txt sample file includes:

    //QEXTOPS JOB (T,JXPO,JKSD0093),DEV,NOTIFY=Q1JACK,
    // MSGCLASS=P,
    // REGION=0M
    //*
    //*QEXTOPS JCL version 1.0 September, 2010
    //*
    //*************************************************************
    //* Change below dataset names to sites specific datasets names*
    //*************************************************************
    //SET1 SET TSSOUT='Q1JACK.EARLOUT.ALL',
    // EARLOUT='Q1JACK.QEXTOPS.PROGRAM.OUTPUT'
    //*************************************************************
    //* Delete old datasets                                        *
    //*************************************************************
    //DEL EXEC PGM=IEFBR14
    //DD1 DD DISP=(MOD,DELETE),DSN=&TSSOUT,
    // UNIT=SYSDA,
    // SPACE=(CYL,(10,10)),
    // DCB=(RECFM=FB,LRECL=80)
    //DD2 DD DISP=(MOD,DELETE),DSN=&EARLOUT,
    // UNIT=SYSDA,
    // SPACE=(CYL,(10,10)),
    // DCB=(RECFM=FB,LRECL=80)
    //*************************************************************
    //* Allocate new dataset                                       *
    //*************************************************************
    //ALLOC EXEC PGM=IEFBR14
    //DD1 DD DISP=(NEW,CATLG),DSN=&EARLOUT,
    // SPACE=(CYL,(100,100)),
    // DCB=(RECFM=VB,LRECL=1028,BLKSIZE=6144)
    //*************************************************************
    //* Execute Top Secret TSSUTIL utility to extract smf records  *
    //*************************************************************
    //REPORT EXEC PGM=TSSUTIL
    //SMFIN DD DISP=SHR,DSN=&SMFIN1
    //SMFIN1 DD DISP=SHR,DSN=&SMFIN2
    //UTILOUT DD DSN=&UTILOUT,
    // DISP=(,CATLG),UNIT=SYSDA,SPACE=(CYL,(50,10),RLSE),
    // DCB=(RECFM=FB,LRECL=133,BLKSIZE=0)
    //EARLOUT DD DSN=&TSSOUT,
    // DISP=(NEW,CATLG),UNIT=SYSDA,
    // SPACE=(CYL,(200,100),RLSE),
    // DCB=(RECFM=VB,LRECL=456,BLKSIZE=27816)
    //UTILIN DD *
    NOLEGEND
    REPORT EVENT(ALL) END
    /*
    //*************************************************************
    //EXTRACT EXEC PGM=QEXTOPS,DYNAMNBR=10,
    // TIME=1440
    //STEPLIB DD DISP=SHR,DSN=Q1JACK.C.LOAD
    //SYSTSIN DD DUMMY
    //SYSTSPRT DD SYSOUT=*
    //SYSPRINT DD SYSOUT=*
    //CFG DD DUMMY
    //EARLIN DD DISP=SHR,DSN=&TSSOUT
    //EARLOUT DD DISP=SHR,DSN=&EARLOUT
    //*************************************************************
    //FTP EXEC PGM=FTP,REGION=3800K
    //INPUT DD *
    <IPADDR>
    <USER>
    <PASSWORD>
    PUT '<EARLOUT>' EARL_<THEIPOFTHEMAINFRAMEDEVICE>/<EARLOUT>
    QUIT
    //OUTPUT DD SYSOUT=*
    //SYSPRINT DD SYSOUT=*

  9. After the output file is created, schedule a job to transfer the output file to an interim FTP server.

    You must configure the following parameters in the sample JCL to successfully forward the output to an interim FTP server, where:

    <IPADDR> is the IP address or host name of the interim FTP server to receive the output file.

    <USER> is the user name that is needed to access the interim FTP server.

    <PASSWORD> is the password that is needed to access the interim FTP server.

    <THEIPOFTHEMAINFRAMEDEVICE> is the destination of the mainframe or interim FTP server that receives the output.

    <QEXOUTDSN> is the name of the output file that is saved to the interim FTP server.

    You are now ready to configure the Log File protocol. A minimal transfer-verification sketch appears at the end of this topic.

  10. Schedule JSA to collect the output file from CA Top Secret.

    If the z/OS platform is configured to serve files through FTP or SFTP, or to allow SCP, then no interim FTP server is needed and JSA can pull the output file directly from the mainframe. The following text must be commented out by using //* or deleted from the qextops_jcl.txt file:

    //FTP EXEC PGM=FTP,REGION=3800K
    //INPUT DD *
    <IPADDR>
    <USER>
    <PASSWORD>
    PUT '<EARLOUT>' EARL_<THEIPOFTHEMAINFRAMEDEVICE>/<EARLOUT>
    QUIT
    //OUTPUT DD SYSOUT=*
    //SYSPRINT DD SYSOUT=*

You are now ready to configure the log source in JSA.
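
If you route the output through an interim FTP server, the following minimal Python sketch can confirm that the transferred file is present before you point the Log File protocol at it. The host name, credentials, and EARL_ file-name prefix are placeholders based on the sample JCL above; replace them with your own values.

    from ftplib import FTP

    # Placeholder connection details for the interim FTP server.
    with FTP("ftp.example.com") as ftp:
        ftp.login(user="ftpuser", passwd="password")
        # List the directory and check for the transferred output file.
        names = ftp.nlst()
        print("output present:", any(name.startswith("EARL_") for name in names))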