SRX5800 Line Cards and Modules

 

SRX5400, SRX5600, and SRX5800 Services Gateway Card Overview

The cards described in this guide let you upgrade and customize your SRX5400, SRX5600, or SRX5800 Services Gateway to suit the needs of your network. The following types of cards are available for the SRX5400, SRX5600, and SRX5800 Services Gateways:

  • I/O cards (IOCs) provide additional physical network connections to the services gateway. Their primary function is to deliver data packets arriving on the physical ports to the Services Processing Cards (SPCs) and to forward data packets out the physical ports after services processing.

  • Flex IOCs have two slots for port modules that add additional physical network connections to the services gateway. Like IOCs, their primary function is to deliver data packets arriving on the physical ports to the SPCs and to forward data packets out the physical ports after services processing.

  • Modular Port Concentrators (MPCs) have slots on the front panel that accept smaller cards called Modular Interface Cards (MICs). Each MIC has one or more physical interfaces on it. An MPC with MICs installed functions in the same way as a regular I/O card (IOC), but allows greater flexibility in adding different types of Ethernet ports to your services gateway. MPCs and MICs are similar in form and function to Flex IOCs and port modules. However, the two use different form-factors, so you cannot install port modules in an MPC, nor can you install MICs in a Flex IOC.

  • Services Processing Cards (SPCs) provide the processing power to run integrated services such as firewall, IPsec and IDP. All traffic traversing the services gateway is passed to an SPC to have services processing applied to it.

  • Switch Control Boards (SCBs) power on and power off IOCs and SPCs; control clocking and system resets; and control booting and monitoring of system functions. Each SCB has a slot in the front panel for a Routing Engine.

Although they are not cards in the sense of having a form factor that fits the card cage of the SRX5400, SRX5600, and SRX5800 Services Gateways, this guide also addresses the following modules, which fit into certain SRX5400, SRX5600, and SRX5800 Services Gateway cards:

  • Routing Engines fit into slots in SCBs and maintain the routing tables, manage the routing protocols used on the device, control the device interfaces and some chassis components, and provide the interface for system management and user access to the device.

  • Port modules fit into slots in Flex IOCs and add additional physical network interface ports to the services gateway.

  • Modular Interface Cards (MICs) fit into slots in MPCs and add additional physical network interface ports to the services gateway. MPCs and MICs are similar in form and function to Flex IOCs and port modules. However, the two use different form-factors, so you cannot install port modules in an MPC, nor can you install MICs in a Flex IOC.

Cards Supported on SRX5400, SRX5600, and SRX5800 Services Gateways

Table 1 describes the cards and other modules supported on the SRX5400, SRX5600, and SRX5800 Services Gateways.

Table 1: Supported Cards for SRX5400, SRX5600, and SRX5800 Services Gateways

                                                              Earliest Supported Junos OS Release
Card Name and Model Number                                    SRX5400          SRX5600 and SRX5800

SPCs

Services Processing Card SRX5K-SPC-2-10-40 Specifications     Not supported    9.2
Services Processing Card SRX5K-SPC-4-15-320 Specifications    12.1X46-D10      12.1X44-D10
Services Processing Card SRX5K-SPC3 Specifications            18.2R1-S1        18.2R1-S1

Interface Cards

I/O Card SRX5K-40GE-SFP Specifications                        Not supported    9.2
I/O Card SRX5K-4XGE-XFP Specifications                        Not supported    9.2
Flex I/O Card (SRX5K-FPC-IOC) Specifications                  Not supported    10.2
Modular Port Concentrator (SRX5K-MPC) Specifications          12.1X46-D10      12.1X46-D10
SRX5K-MPC3-40G10G Specifications                              15.1X49-D10      15.1X49-D10
SRX5K-MPC3-100G10G Specifications                             15.1X49-D10      15.1X49-D10

SCBs

Switch Control Board SRX5K-SCB Specifications                 12.1X46-D10      9.2
Switch Control Board SRX5K-SCBE Specifications                12.1X47-D15      12.1X47-D15
Switch Control Board SRX5K-SCB3 Specifications                15.1X49-D10      15.1X49-D10

Other modules

Flex I/O Card Port Module SRX-IOC-16GE-SFP Specifications     Not supported    10.2
Flex I/O Card Port Module SRX-IOC-16GE-TX Specifications      Not supported    10.2
Flex I/O Card Port Module SRX-IOC-4XGE-XFP Specifications     Not supported    10.2
MIC with 1x100GE CFP Interface (SRX-MIC-1X100G-CFP)           12.1X46-D10      12.1X46-D10
MIC with 2x40GE QSFP+ Interfaces (SRX-MIC-2X40G-QSFP)         12.1X46-D10      12.1X46-D10
MIC with 10x10GE SFP+ Interfaces (SRX-MIC-10XG-SFPP)          12.1X46-D10      12.1X46-D10
MIC with 20x1GE SFP Interfaces (SRX-MIC-20GE-SFP)             12.1X47-D10      12.1X47-D10
Routing Engine SRX5K-RE-13-20 Specifications                  12.1X46-D10      9.2
Routing Engine SRX5K-RE-1800X4 Specifications                 12.1X47-D15      12.1X47-D15

Figure 1 is an interoperability matrix that describes the compatibility between various interface cards for the SRX5400, SRX5600, and SRX5800 Services Gateways.

Figure 1: Interoperability Matrix for SRX5400, SRX5600, and SRX5800 Services Gateways

SRX5800 Services Gateway Card Cage and Slots

The card cage is the set of 14 vertical slots in the front of the chassis where you install cards. The slots are numbered from left to right. Table 2 describes the types of cards that you can install into each slot.

Table 2: SRX5800 Services Gateway Card Cage Slots

Card Cage Slot     Eligible Cards
                   SPC    SPC3    IOC, Flex IOC, or MPC    SCB    IOC3

0 (leftmost)       X      X       X
1                  X      X       X                               X
2                  X      X       X                               X
3                  X      X       X                               X
4                  X      X       X                               X
5                  X      X       X                               X
0                                                          X
1                                                          X
2/6                X      X       X                        X      X
7                  X      X       X                               X
8                  X      X       X                               X
9                  X      X       X                               X
10                 X      X       X                               X
11 (rightmost)     X              X

Note

For operational and cooling efficiency in SRX5800 Services Gateways, we recommend that slots 0 and 11 be filled last.

SRX5800 Services Gateway SPC Description

The Services Processing Card (SPC) has Services Processing Units (SPUs), which provide the processing power to run integrated services such as firewall, IPsec, and IDP (see Figure 2). All traffic traversing the services gateway is passed to an SPU to have services processing applied to it. Traffic is intelligently distributed by interface cards to SPUs for services processing.

The services gateway must have at least one SPC installed.

You can install an SPC in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the device.

Figure 2 shows a typical SPC supported on the services gateway.

Figure 2: Typical SPC

For detailed information about SPCs supported by the services gateway, see the SRX5400, SRX5600, and SRX5800 Services Gateway Card Reference at www.juniper.net/documentation/.

Services Processing Card SRX5K-SPC-2-10-40 Specifications

The SRX5K-SPC-2-10-40 Services Processing Card (SPC) contains two Services Processing Units (SPUs), which provide the processing power to run integrated services such as firewall, IPsec, and IDP (see Figure 3). All traffic traversing the services gateway is passed to an SPU to have services processing applied to it. Traffic is intelligently distributed by I/O cards (IOCs) to SPUs for services processing.

The services gateway must have at least one SPC installed. You can install additional SPCs to increase services processing capacity.

You can install SPCs in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the device.

Figure 3 shows a typical SPC supported on the services gateway.

Figure 3: Services Processing Card SRX5K-SPC-2-10-40

Each SPC consists of the following components:

  • SPC cover, which functions as a ground plane and a stiffener.

  • Two small form-factor pluggable (SFP) chassis cluster control ports for connecting multiple devices into a redundant chassis cluster. See Chassis Cluster Feature Guide for SRX Series Devices for more information about connecting and configuring redundant chassis clusters.

    Caution

    If you face a problem running a Juniper Networks device that uses a third-party optic or cable, the Juniper Networks Technical Assistance Center (JTAC) can help you diagnose the source of the problem. Your JTAC engineer might recommend that you check the third-party optic or cable and potentially replace it with an equivalent Juniper Networks optic or cable that is qualified for the device.

  • Fabric interfaces.

  • Two Gigabit Ethernet interfaces that allow control information, route information, and statistics to be sent between the Routing Engine and the CPU on the SPCs.

  • Two interfaces from the SCBs that enable the boards to be powered on and controlled.

  • Physical SPC connectors.

  • Midplane connectors and power circuitry.

  • Processor subsystem, which includes a 1.2-GHz CPU, system controller, and 1 GB of SDRAM.

  • LEDs on the faceplate that indicate the SPC and SPU status.

Description

SPC with two SPUs

Software release

  • Junos OS Release 9.2 and later

Cables and connectors

CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1–SFP ports for control links in chassis cluster configurations.

Supported SFP transceivers:

1000BASE-LH (model numbers SRX-SFP-1GE-LH, SRX-SFP-1GE-LH-ET)

1000BASE-LX (model numbers SRX-SFP-1GE-LX, SRX-SFP-1GE-LX-ET)

1000BASE-SX (model numbers SRX-SFP-1GE-SX, SRX-SFP-1GE-SX-ET)

Controls

None

Supported Slots

  • SRX5600–Any slot except bottom slots 0 and 1, which are reserved for SCB/RE.

  • SRX5800–Any slot except slots 0 and 1, which are reserved for SCB/RE.

Power Requirement

Maximum 351 W

Weight

Approximately 13 lb (5.9 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The SPC is operating normally.

  • Red–The SPC has failed and is not operating normally.

  • Off–The SPC is powered down.

STATUS LED, one tricolor for each of the two SPUs, SPU 0 and SPU 1:

  • Green–The SPU is operating normally.

  • Amber–The SPU is initializing.

  • Red–The SPU has encountered an error or a failure.

  • Off–The SPU is offline. If both SPUs are offline, it is safe to remove the SPC from the chassis.

SERVICE LED, one bicolor for each of the two SPUs, SPU 0 and SPU 1:

  • Green–Service is running on the SPU under acceptable load.

  • Amber–Service on the SPU is overloaded.

  • Off–Service is not running on the SPU.

HA LED, one tricolor:

  • Green–Clustering is operating normally. All cluster members and monitored links are available, and no error conditions are detected.

  • Red–A critical alarm is present on clustering. A cluster member is missing or unreachable, or the other node is no longer part of a cluster because it has been disabled by the dual membership and detection recovery process in reaction to a control link or fabric link failure.

  • Amber–All cluster members are present, but an error condition has compromised the performance and resiliency of the cluster. The reduced bandwidth could cause packets to be dropped or could result in reduced resiliency because a single point of failure might exist. The error condition might be caused by:

    • The loss of chassis cluster links which causes an interface monitoring failure.

    • An error in an SPU or NPU.

    • Failure of the spu-monitoring or cold-sync-monitoring processes.

    • A chassis cluster IP monitoring failure.

LINK/ACT LED, one for each of the two ports CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1:

  • Green–Chassis cluster control port link is active.

  • Off–No link.

ENABLE LED, one for each of the two ports CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1:

  • Green–The chassis cluster control port is enabled.

  • Off–The chassis cluster control port is disabled.

Serial Number Location

The serial number label is located as shown in Figure 4.

Figure 4: Serial Number Label (IOC Shown, Other Cards Similar)

Services Processing Card SRX5K-SPC-4-15-320 Specifications

The SRX5K-SPC-4-15-320 Services Processing Card (SPC) contains four Services Processing Units (SPUs), which provide the processing power to run integrated services such as firewall, IPsec, and IDP (see Figure 5). All traffic traversing the services gateway is passed to an SPU to have services processing applied to it. Traffic is intelligently distributed by I/O cards (IOCs) to SPUs for services processing.

The services gateway must have at least one SPC installed. You can install additional SPCs to increase services processing capacity.

You can install SPCs in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the device.

If your services gateway contains a mix of SRX5K-SPC-4-15-320 SPCs and earlier SRX5K-SPC-2-10-40 SPCs, an SRX5K-SPC-4-15-320 SPC must occupy the lowest-numbered slot of any SPC in the chassis. This configuration ensures that the central point (CP) function is performed by the faster, higher-performance SPC type.
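The slot-placement rule above can be expressed as a small check. This is an illustrative sketch only, not a Juniper tool: the function name and the slot-to-model mapping are hypothetical, and a real chassis inventory would come from the device itself.

```python
# Hypothetical check: in a chassis mixing SPC types, the lowest-numbered
# SPC slot should hold an SRX5K-SPC-4-15-320 so that the central point
# (CP) runs on the faster card. The mapping below is illustrative.

def cp_slot_ok(spc_slots):
    """spc_slots maps chassis slot number -> installed SPC model."""
    if not spc_slots:
        return False
    lowest = min(spc_slots)  # lowest-numbered slot holding an SPC
    return spc_slots[lowest] == "SRX5K-SPC-4-15-320"

# Faster SPC in the lowest SPC slot: valid placement.
print(cp_slot_ok({2: "SRX5K-SPC-4-15-320", 4: "SRX5K-SPC-2-10-40"}))  # True
# Older SPC in the lowest SPC slot: CP would land on the slower card.
print(cp_slot_ok({2: "SRX5K-SPC-2-10-40", 4: "SRX5K-SPC-4-15-320"}))  # False
```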

Figure 5: Services Processing Card SRX5K-SPC-4-15-320

Each SPC consists of the following components:

  • SPC cover, which functions as a ground plane and a stiffener.

  • Two small form-factor pluggable (SFP) chassis cluster control ports for connecting multiple devices into a redundant chassis cluster. See Chassis Cluster Feature Guide for SRX Series Devices for more information about connecting and configuring redundant chassis clusters.

    Caution

    If you face a problem running a Juniper Networks device that uses a third-party optic or cable, the Juniper Networks Technical Assistance Center (JTAC) can help you diagnose the source of the problem. Your JTAC engineer might recommend that you check the third-party optic or cable and potentially replace it with an equivalent Juniper Networks optic or cable that is qualified for the device.

  • Fabric interfaces.

  • Two Gigabit Ethernet interfaces that allow control information, route information, and statistics to be sent between the Routing Engine and the CPU on the SPCs.

  • Two interfaces from the SCBs that enable the boards to be powered on and controlled.

  • Physical SPC connectors.

  • Midplane connectors and power circuitry.

  • Processor subsystem, which includes a 1.2-GHz CPU, system controller, and 1 GB of SDRAM.

  • LEDs on the faceplate that indicate the SPC and SPU status.

Description

SPC with four SPUs

Software release

  • Junos OS Release 12.1X44-D10 and later

Cables and connectors

CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1–SFP ports for control links in chassis cluster configurations.

Supported SFP transceivers:

1000BASE-LH (model numbers SRX-SFP-1GE-LH, SRX-SFP-1GE-LH-ET)

1000BASE-LX (model numbers SRX-SFP-1GE-LX, SRX-SFP-1GE-LX-ET)

1000BASE-SX (model numbers SRX-SFP-1GE-SX, SRX-SFP-1GE-SX-ET)

Controls

None

Supported Slots

  • SRX5400–Any slot except bottom slot 0, which is reserved for SCB/RE.

  • SRX5600–Any slot except bottom slots 0 and 1, which are reserved for SCB/RE.

  • SRX5800–Any slot except slots 0 and 1, which are reserved for SCB/RE.

Power Requirement

475 W typical, 585 W maximum

Note:

  • In the SRX5600 and SRX5800 Services Gateways, you must have high-capacity power supplies (either AC or DC) and high-capacity fan trays installed in the services gateway in order to install and use SRX5K-SPC-4-15-320 SPCs. If you do not have high-capacity power supplies and fan trays installed, the services gateway will log an alarm condition when it recognizes the SRX5K-SPC-4-15-320 SPCs.

  • On SRX5600 Services Gateways with AC power supplies, we recommend that you use high-line (220 V) input power to ensure that the device has adequate power to support SRX5K-SPC-4-15-320 SPCs.

Weight

Approximately 18 lb (8.3 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The SPC is operating normally.

  • Red–The SPC has failed and is not operating normally.

  • Off–The SPC is powered down.

STATUS LED, one tricolor for each of the four SPUs, SPU 0 through SPU 3:

  • Green–The SPU is operating normally.

  • Amber–The SPU is initializing.

  • Red–The SPU has encountered an error or a failure.

  • Off–The SPU is offline. If all four SPUs are offline, it is safe to remove the SPC from the chassis.

SERVICE LED, one bicolor for each of the four SPUs, SPU 0 through SPU 3:

  • Green–Service is running on the SPU under acceptable load.

  • Amber–Service on the SPU is overloaded.

  • Off–Service is not running on the SPU.

HA LED, one tricolor:

  • Green–Clustering is operating normally. All cluster members and monitored links are available, and no error conditions are detected.

  • Red–A critical alarm is present on clustering. A cluster member is missing or unreachable, or the other node is no longer part of a cluster because it has been disabled by the dual membership and detection recovery process in reaction to a control-link or fabric-link failure.

  • Amber–All cluster members are present, but an error condition has compromised the performance and resiliency of the cluster. The reduced bandwidth could cause packets to be dropped or could result in reduced resiliency because a single point of failure might exist. The error condition might be caused by:

    • The loss of chassis cluster links which causes an interface monitoring failure.

    • An error in an SPU or NPU.

    • Failure of the spu-monitoring or cold-sync-monitoring processes.

    • A chassis cluster IP monitoring failure.

  • Off–The node is not configured for clustering or it has been disabled by the dual membership and detection recovery process in reaction to a control link or fabric link failure.

LINK/ACT LED, one for each of the two ports CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1:

  • Green–Chassis cluster control port link is active.

  • Off–No link.

ENABLE LED, one for each of the two ports CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1:

  • Green–The chassis cluster control port is enabled.

  • Off–The chassis cluster control port is disabled.

Serial Number Location

The serial number label is located as shown in Figure 6.

Figure 6: Serial Number Label (IOC Shown, Other Cards Similar)

Services Processing Card SRX5K-SPC3 Specifications

The SRX5K-SPC3 Services Processing Card (SPC) contains two Services Processing Units (SPUs) with 128 GB of memory per SPU, which provide the processing power to run integrated services such as firewall, IPsec, and IDP (see Figure 7). All traffic traversing the services gateway is passed to an SPU to have services processing applied to it. Traffic is intelligently distributed by I/O cards (IOCs) to SPUs for services processing.

The services gateway must have at least one SPC installed. You can install additional SPCs to increase services processing capacity.

SPCs cannot be installed in slots that are reserved for Switch Control Boards (SCBs) or in slot 11 on the SRX5800. If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the device.

Note

Your services gateway cannot contain a mix of SRX5K-SPC-2-10-40 SPCs and SRX5K-SPC3 SPCs. Starting in Junos OS Releases 18.2R2 and 18.4R1 (but not 18.3R1), you can install a mix of SRX5K-SPC-4-15-320 SPCs and SRX5K-SPC3 SPCs.
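The mixing rule in this note can be sketched as a small validation helper. The function name is hypothetical and the sketch deliberately does not model the per-release gating (18.2R2/18.4R1); it encodes only the model-mix constraint stated above.

```python
# Hypothetical validator for the SPC mixing rule in the note:
# SRX5K-SPC3 cards may share a chassis with SRX5K-SPC-4-15-320 cards
# (on supporting releases), but never with SRX5K-SPC-2-10-40 cards.
# Junos OS release checking is out of scope for this sketch.

def spc_mix_allowed(models):
    """models is the set of SPC model strings installed in the chassis."""
    forbidden_pair = {"SRX5K-SPC3", "SRX5K-SPC-2-10-40"}
    return not forbidden_pair <= set(models)

print(spc_mix_allowed({"SRX5K-SPC3", "SRX5K-SPC-4-15-320"}))  # True
print(spc_mix_allowed({"SRX5K-SPC3", "SRX5K-SPC-2-10-40"}))   # False
```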

Figure 7: Services Processing Card SRX5K-SPC3

Each SPC consists of the following components:

  • SPC cover, which functions as a ground plane and a stiffener.

  • Two 10-Gigabit Ethernet small form-factor pluggable plus (SFP+) chassis cluster control ports for connecting multiple devices into a redundant chassis cluster. See the Chassis Cluster Feature Guide for SRX Series Devices for more information about connecting and configuring redundant chassis clusters.

    Caution

    If you face a problem running a Juniper Networks device that uses a third-party optic or cable, the Juniper Networks Technical Assistance Center (JTAC) can help you diagnose the source of the problem. Your JTAC engineer might recommend that you check the third-party optic or cable and potentially replace it with an equivalent Juniper Networks optic or cable that is qualified for the device.

  • Fabric interfaces.

  • One Gigabit Ethernet switch that provides control connectivity to the Routing Engine.

  • Two interfaces from the SCBs that enable the boards to be powered on and controlled.

  • Physical SPC connectors.

  • Midplane connectors and power circuitry.

  • Processor subsystem, which includes a 2.3-GHz CPU, system controller, and two 128-GB solid-state drives (SSDs).

  • LEDs on the faceplate that indicate the SPC and SPU status.

Description

SPC with two SPUs and 256 GB of memory (128 GB per SPU)

Software release

  • Junos OS Release 18.2R1-S1 and later

Cables and connectors

HA0 and HA1 SFP+ ports for control links in chassis cluster configurations.

Supported transceivers:

  • 10GBASE-LR: transceiver model number SRX-SFP-10GE-LR

  • 10GBASE-SR: transceiver model number SRX-SFP-10GE-SR

Controls

None

Supported Slots

  • SRX5400–Any slot except bottom slot 0, which is reserved for SCB/RE.

  • SRX5600–Any slot except bottom slots 0 and 1, which are reserved for SCB/RE.

  • SRX5800–Any slot except slot 11 and slots 0 and 1, which are reserved for SCB/RE.

Power Requirement

650 W maximum

Note:

  • In the SRX5600 and SRX5800 Services Gateways, you must have high-capacity power supplies (either AC or DC) and high-capacity fan trays installed in the services gateway in order to install and use SRX5K-SPC3 SPCs. If you do not have high-capacity power supplies and fan trays installed, the services gateway will log an alarm condition when it recognizes the SRX5K-SPC3 SPCs.

  • On SRX5600 Services Gateways with AC power supplies, we recommend that you use high-line (220 V) input power to ensure that the device has adequate power to support SRX5K-SPC3 SPCs.

Weight

Approximately 18 lb (8.3 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The SPC is operating normally.

  • Red–The SPC has failed and is not operating normally.

  • Off–The SPC is powered down.

STATUS LED, one tricolor for each of the two SPUs, SPU 0 and SPU 1:

  • Off–The SPU is offline.

  • Blinking Amber–The SPU is initializing.

  • Green–The SPU initialization is done and it is operating normally.

  • Red–The SPU has encountered an error or a failure.

SERVICE LED, one tricolor for each of the two SPUs, SPU 0 and SPU 1:

  • Off–The SPU is offline.

  • Blinking Red–The SPU initialization is done.

  • Blinking Amber–Service is initializing on the SPU.

  • Green–Service is running on the SPU under acceptable load.

  • Solid Red–Service encountered an error or a failure.

HA LED, one tricolor:

  • Green–Clustering is operating normally. All cluster members and monitored links are available, and no error conditions are detected.

  • Red–A critical alarm is present on clustering. A cluster member is missing or unreachable, or the other node is no longer part of a cluster because it has been disabled by the dual membership and detection recovery process in reaction to a control-link or fabric-link failure.

  • Amber–All cluster members are present, but an error condition has compromised the performance and resiliency of the cluster. The reduced bandwidth could cause packets to be dropped or could result in reduced resiliency because a single point of failure might exist. The error condition might be caused by:

    • The loss of chassis cluster links which causes an interface monitoring failure.

    • An error in an SPU or NPU.

    • Failure of the spu-monitoring or cold-sync-monitoring processes.

    • A chassis cluster IP monitoring failure.

  • Off–The node is not configured for clustering or it has been disabled by the dual membership and detection recovery process in reaction to a control link or fabric link failure.

LINK/ACT LED, one for each of the two ports CHASSIS CLUSTER CONTROL 0 and CHASSIS CLUSTER CONTROL 1:

  • Green–Chassis cluster control port link is active.

  • Off–No link.

Modular Port Concentrator (SRX5K-MPC) Specifications

The SRX5K-MPC (see Figure 8) is an interface card with two slots that accept MICs. These MICs add Ethernet ports to your services gateway. An MPC with MICs installed functions in the same way as a regular IOC but allows you to add different types of Ethernet ports to your services gateway. MPCs and MICs are similar in form and function to Flex IOCs and port modules. However, the two use different form-factors, so you cannot install port modules in an MPC, nor can you install MICs in a Flex IOC.

You must install at least one interface card in the services gateway. The interface card can be any of the available IOC, Flex IOC, or MPC types. You can install just one MIC, or two MICs of the same or different types.

You can install MPCs in any of the slots that are not reserved for Switch Control Boards (SCBs).

If a slot in the SRX5400, SRX5600, or SRX5800 Services Gateway card cage is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway. If a slot in an MPC is not occupied by a MIC, you must install a blank panel in the empty MIC slot to shield it and to allow cooling air to circulate properly through the MPC.

Figure 8: SRX5K-MPC
Note

When installing an SRX5K-MPC in an SRX5600 or SRX5800 Services Gateway:

  • If the session-distribution-mode statement has not been explicitly configured using the CLI command:

    user@host# set security forwarding-process application-services session-distribution-mode

    the SRX5K-MPC automatically defaults to hash-based mode, even if existing SRX5K-MPCs or non-MPC cards are installed. You cannot set the session-distribution-mode to normal.

  • If the session-distribution-mode statement has been explicitly configured to normal and the MPC is installed in the device, the SRX5K-MPC remains offline, and the services gateway generates a major alarm and logs the event for troubleshooting. You must explicitly configure hash-based mode using the CLI command:

    user@host# set security forwarding-process application-services session-distribution-mode hash-based

When installing an SRX5K-MPC in an SRX5400 Services Gateway, session distribution functions only in hash-based mode, whether configured explicitly or set by default. The normal mode is not supported.

Moving from session-based mode to hash-based mode (for SRX5K-MPCs or non-MPC cards) reduces packets-per-second (PPS) throughput by about 9%, whereas no drop is observed in connections per second (CPS) or session capacity.

For more information about the CLI command, see the Junos OS documentation at www.juniper.net/documentation/.

Description

  • MPC with slots for two MICs

  • Maximum throughput:

    75 Gbps per slot in Junos OS Release 12.1X46-D10 and later

    120 Gbps per slot in Junos OS Release 12.1X47-D15 and later

Software release

Junos OS Release 12.1X46-D10 and later

Cables and connectors

Slots for two MICs

Controls

One ejector knob each for MIC slots 0 and 1. Pull the ejector knob to unseat and partially eject the adjacent MIC.

Supported slots

  • SRX5400–Any slot except bottom slot 0

  • SRX5600–Any slot except bottom slots 0 or 1

  • SRX5800–Any slot except center slots 0 or 1

Power requirement

Maximum of 570 W for the MPC with two MICs, including applicable transceivers.

Note:

  • To install and use SRX5K-MPCs in the SRX5600 and SRX5800 Services Gateways, you must have high-capacity power supplies (either AC or DC) and high-capacity fan trays installed in the services gateways. All models of SRX5400 Services Gateways already include high-capacity supplies. If you do not have high-capacity power supplies and fan trays installed, the services gateway will log an alarm condition when it recognizes the SRX5K-MPCs.

  • On SRX5400 and SRX5600 Services Gateways with AC power supplies, we recommend that you use high-line (220 V) input power to ensure that the devices have adequate power to support SRX5K-MPCs.

Weight

Approximately 10 lb (4.5 kg) without MICs

LEDs

OK/FAIL LED, one bicolor:

  • Green–The MPC is operating normally.

  • Blinking green–The MPC is transitioning to online or offline.

  • Red–The MPC has failed and is not operating normally.

  • Off–The MPC is powered down.

Serial number location

The serial number label is yellow and is located on the opposite side of the card.

MIC with 20x1GE SFP Interfaces (SRX-MIC-20GE-SFP)

You use Modular Interface Cards (MICs) and Modular Port Concentrators (MPCs) to add different combinations of Ethernet interfaces to your services gateway to suit the specific needs of your network.

The SRX-MIC-20GE-SFP MIC (see Figure 9) can be installed in the SRX5K-MPC to add twenty 1-Gigabit Ethernet small form-factor pluggable (SFP) ports.

Figure 9: SRX-MIC-20GE-SFP
SRX-MIC-20GE-SFP

Description

  • MIC with twenty 1-Gigabit Ethernet SFP ports

  • Fits into either of the two slots of the SRX5K-MPC

  • Supports up to 20 Gbps of full-duplex traffic

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 20 Gbps

Software release

Junos OS Release 12.1X47-D10

Cables and connectors

Sockets for 20 SFP Gigabit Ethernet transceivers.

Supported SFP transceivers:

1000BASE-LX (model numbers SRX-SFP-1GE-LX, SRX-SFP-1GE-LX-ET)

1000BASE-SX (model numbers SRX-SFP-1GE-SX, SRX-SFP-1GE-SX-ET)

1000BASE-T (model numbers SRX-SFP-1GE-T, SRX-SFP-1GE-T-ET)

Weight

Approximately 1.2 lb (0.54 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Green–MIC is operating normally.

  • Red–MIC has failed.

  • Off–MIC is powered down.

LINK LED, single color, one per SFP port:

  • Green–Link is active.

  • Off–Link is inactive.

Port and Interface Numbering

Each MPC accepts up to two MICs. SRX-MIC-20GE-SFP is a 20-port Gigabit Ethernet MIC with SFP.

Each port on a MIC corresponds to a unique interface name in the CLI.

In the syntax of an interface name, a hyphen (-) separates the media type from the MPC number (represented as an FPC in the CLI). The MPC slot number corresponds to the first number in the interface. The second number in the interface corresponds to the logical PIC number. The last number in the interface matches the port number on the MIC. Slashes (/) separate the MPC number from the logical PIC number and port number:

type-fpc/pic/port

  • type—Media type, which identifies the network device. For example:

    • ge—Gigabit Ethernet interface

    • so—SONET/SDH interface

    • xe—10-Gigabit Ethernet interface

    For a complete list of media types, see Interface Naming Overview.

  • fpc—Slot in which the MPC is installed in an SRX5400, SRX5600, or SRX5800 Services Gateway.

  • pic—Logical PIC on the MIC, numbered 0 or 1 when the MIC is installed in the first MPC slot, and 2 or 3 when it is installed in the second slot.

  • port—Port number.
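The naming scheme above can be sketched in a few lines. This is an illustrative sketch, not a Juniper utility: the function name is hypothetical, and the logical-PIC arithmetic assumes what the text states for the SRX-MIC-20GE-SFP (a MIC in MPC slot 0 exposes PICs 0 and 1, a MIC in slot 1 exposes PICs 2 and 3, with 10 ports per logical PIC).

```python
# Illustrative sketch of the type-fpc/pic/port interface-naming scheme.
# Maps a physical port on an SRX-MIC-20GE-SFP MIC to its CLI name.

def interface_name(media, fpc, mic_slot, mic_port):
    """media: e.g. "ge"; fpc: MPC chassis slot; mic_slot: 0 or 1;
    mic_port: physical port on the 20-port MIC (0-19)."""
    pic = mic_slot * 2 + mic_port // 10  # logical PIC on the MIC
    port = mic_port % 10                 # port within the logical PIC
    return f"{media}-{fpc}/{pic}/{port}"

# Physical port 14 of a MIC in MIC slot 0 of an MPC in chassis slot 2:
print(interface_name("ge", 2, 0, 14))  # ge-2/1/4
# The same physical port on a MIC in MIC slot 1:
print(interface_name("ge", 2, 1, 14))  # ge-2/3/4
```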

Figure 10 shows the SRX-MIC-20GE-SFP MIC installed in slot 0 of an MPC in slot 2 of an SRX5400, SRX5600, or SRX5800 Services Gateway.

Figure 10: SRX-MIC-20GE-SFP Interface Port Mapping

The SRX-MIC-20GE-SFP MIC contains two logical PICs, numbered PIC 0 and PIC 1 in the CLI. Each logical PIC contains 10 ports numbered 0 through 9.

The following sample output of the show chassis fpc pic-status command shows two 20-port Gigabit Ethernet MICs with SFP installed in the two slots of an MPC in slot 2.

The logical PICs of the two MICs are shown as PIC 0, PIC 1, PIC 2, and PIC 3, each described as 10x 1GE(LAN) SFP.

user@host> show chassis fpc pic-status
node1:
--------------------------------------------------------------------------
Slot 1   Online       SRX5k SPC II
  PIC 0  Online       SPU Cp
  PIC 1  Online       SPU Flow
  PIC 2  Online       SPU Flow
  PIC 3  Online       SPU Flow
Slot 2   Online       SRX5k IOC II
  PIC 0  Online       10x 1GE(LAN) SFP
  PIC 1  Online       10x 1GE(LAN) SFP
  PIC 2  Online       10x 1GE(LAN) SFP
  PIC 3  Online       10x 1GE(LAN) SFP

{primary:node1}

The show interfaces terse command output displays the Gigabit Ethernet interfaces that correspond to all the ports located on the two MICs.

user@host> show interfaces terse


Interface               Admin Link Proto    Local                 Remote
gr-0/0/0                up    up
ip-0/0/0                up    up
lt-0/0/0                up    up
ge-2/0/0                up    up
ge-2/0/1                up    down
ge-2/0/2                up    down
ge-2/0/3                up    down
ge-2/0/4                up    down
ge-2/0/5                up    up
ge-2/0/6                up    down
ge-2/0/7                up    down
ge-2/0/8                up    up
ge-2/0/9                up    up
ge-2/1/0                up    down
ge-2/1/1                up    up
ge-2/1/2                up    down
ge-2/1/3                up    down
ge-2/1/4                up    up
ge-2/1/5                up    down
ge-2/1/6                up    down
ge-2/1/7                up    down
ge-2/1/8                up    up
ge-2/1/9                up    up
ge-2/2/0                up    down
ge-2/2/1                up    down
ge-2/2/2                up    down
ge-2/2/3                up    down
ge-2/2/4                up    down
ge-2/2/5                up    down
ge-2/2/6                up    down
ge-2/2/7                up    down
ge-2/2/8                up    down
ge-2/2/9                up    down
ge-2/3/0                up    down
ge-2/3/1                up    down
ge-2/3/2                up    down
ge-2/3/3                up    down
ge-2/3/4                up    down
ge-2/3/5                up    down
ge-2/3/6                up    down
ge-2/3/7                up    down
ge-2/3/8                up    down
ge-2/3/9                up    down

Serial number location

The serial number label is yellow and is located as shown in Figure 11.

Figure 11: SRX-MIC-20GE-SFP Serial Number Label

Note: The serial number for the mezzanine card is shown for reference only and is not used.

MIC with 10x10GE SFP+ Interfaces (SRX-MIC-10XG-SFPP)

You use MICs and MPCs to add different combinations of Ethernet interfaces to your services gateway to suit the specific needs of your network. The SRX-MIC-10XG-SFPP (see Figure 12) can be installed in an MPC to add ten 10-Gigabit Ethernet SFP+ ports.

Figure 12: SRX-MIC-10XG-SFPP

Description

  • MIC with ten SFP+ 10-Gigabit Ethernet ports

  • Fits into MPC

  • Supports up to 100 Gbps of full-duplex traffic

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 100 Gbps

Software release

Junos OS Release 12.1X46-D10

Cables and connectors

Sockets for ten 10-Gbps SFP+ transceivers

See Hardware Compatibility Tool for the transceivers supported.

Supported slots

Either slot in SRX5K-MPC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Green–The MIC is operating normally.

  • Red–The MIC has failed and is not operating normally.

  • Off–The MIC is powered down.

LINK LED, single color:

  • Green–The link is active.

  • Off–No link.

Port and Interface Numbering

Each port on a MIC corresponds to a unique interface name in the CLI.

In the syntax of an interface name, a hyphen (-) separates the media type from the MPC number (represented as an FPC in the CLI). The MPC slot number corresponds to the first number in the interface. The second number in the interface corresponds to the logical PIC number. The last number in the interface matches the port number on the MIC. Slashes (/) separate the MPC number from the logical PIC number and port number:

type-fpc/pic/port

  • type—Media type, which identifies the network device. For example:

    • ge—Gigabit Ethernet interface

    • so—SONET/SDH interface

    • xe—10-Gigabit Ethernet interface

    For a complete list of media types, see Interface Naming Overview.

  • fpc—Slot in which the MPC is installed in an SRX5400, SRX5600, or SRX5800 Services Gateway.

  • pic—Logical PIC on the MIC, numbered 0 when installed in the first slot or 2 when installed in the second slot.

  • port—Port number.

Figure 13 shows the port and interface numbering of an SRX-MIC-10XG-SFPP MIC when it is installed in slot 0 of an MPC in slot 2 of an SRX5400, SRX5600, or SRX5800 Services Gateway.

Figure 13: SRX-MIC-10XG-SFPP Port and Interface Numbering

The SRX-MIC-10XG-SFPP MIC contains one logical PIC, numbered PIC 0 in the CLI when inserted in the first slot of the MPC or PIC 2 when inserted in the second slot of the MPC. Each logical PIC contains 10 ports numbered 0 through 9.
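Conversely, an interface name can be split back into its components. This small sketch (the helper parse_interface is hypothetical, for illustration only) assumes the type-fpc/pic/port syntax described above:

```python
# Hypothetical helper: split a type-fpc/pic/port interface name into parts.

def parse_interface(name):
    media, rest = name.split("-", 1)          # e.g. "xe", "2/0/3"
    fpc, pic, port = (int(n) for n in rest.split("/"))
    return media, fpc, pic, port

# For an SRX-MIC-10XG-SFPP in the first MIC slot of the MPC in slot 2,
# port 3 appears in the CLI as xe-2/0/3; in the second MIC slot the same
# port would be xe-2/2/3 (logical PIC 2).
print(parse_interface("xe-2/0/3"))   # ('xe', 2, 0, 3)
```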

The following sample output of the show chassis fpc pic-status command shows two 10-port 10-Gigabit Ethernet MICs with SFP+ installed in the two slots of an MPC in slot 2.

The logical PICs of the two MICs are shown as PIC 0 and PIC 2, each described as 10x 10GE SFP+.

user@host> show chassis fpc pic-status


Slot 1   Online       SRX5k SPC II
  PIC 0  Online       SPU Cp
  PIC 1  Online       SPU Flow
  PIC 2  Online       SPU Flow
  PIC 3  Online       SPU Flow
Slot 2   Online       SRX5k IOC II
  PIC 0  Online       10x 10GE SFP+
  PIC 2  Online       10x 10GE SFP+

The show interfaces terse command output displays the 10-Gigabit Ethernet interfaces that correspond to the 10 ports located on each MIC.

user@host> show interfaces terse


Interface               Admin Link Proto    Local                 Remote
gr-0/0/0                up    up
ip-0/0/0                up    up
lt-0/0/0                up    up
xe-2/0/0                up    up
xe-2/0/1                up    up
xe-2/0/2                up    up
xe-2/0/2.0              up    up   inet     131.131.131.2/24
                                   inet6    1300::2/64
                                            fe80::224:dcff:fe20:b94c/64
                                   multiservice
xe-2/0/3                up    up
xe-2/0/4                up    up
xe-2/0/5                up    up
xe-2/0/6                up    up
xe-2/0/6.0              up    up   inet     141.141.141.1/24
                                   inet6    1400::1/64
                                            fe80::224:dcff:fe20:b950/64
                                   multiservice
xe-2/0/7                up    down
xe-2/0/8                up    down
xe-2/0/9                up    down
xe-2/2/0                up    down
xe-2/2/1                up    down
xe-2/2/2                up    down
xe-2/2/3                up    down
xe-2/2/4                up    down
xe-2/2/5                up    down
xe-2/2/6                up    down
xe-2/2/7                up    down
xe-2/2/8                up    down
xe-2/2/9                up    down

Serial number location

The serial number label is yellow and located as shown in Figure 14.

Figure 14: SRX-MIC-10XG-SFPP Serial Number Label

MIC with 1x100GE CFP Interface (SRX-MIC-1X100G-CFP)

You use MICs and MPCs to add different combinations of Ethernet interfaces to your services gateway to suit the specific needs of your network. The SRX-MIC-1X100G-CFP (see Figure 15) can be installed in an MPC to add one 100-Gigabit Ethernet CFP port.

Figure 15: SRX-MIC-1X100G-CFP

Description

  • MIC with one CFP 100-Gigabit Ethernet port

  • Fits into MPC

  • Supports up to 100 Gbps of full-duplex traffic

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 100 Gbps

Software release

Junos OS Release 12.1X46-D10

Cables and connectors

One socket for a 100-Gigabit CFP transceiver.

Supported CFP transceivers:

  • 100GBASE-LR4 (model number: SRX-CFP-100G-LR4)

  • 100GBASE-SR10 (model number: SRX-CFP-100G-SR10)

Supported slots

Either slot in SRX5K-MPC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Green–The MIC is operating normally.

  • Red–The MIC has failed and is not operating normally.

  • Off–The MIC is powered down.

LINK LED, single color:

  • Green–The link is active.

  • Off–No link.

Serial number location

The serial number label is yellow and located as shown in Figure 16.

Figure 16: SRX-MIC-1X100G-CFP Serial Number Label

MIC with 2x40GE QSFP+ Interfaces (SRX-MIC-2X40G-QSFP)

You use MICs and MPCs to add different combinations of Ethernet interfaces to your services gateway to suit the specific needs of your network. The SRX-MIC-2X40G-QSFP (see Figure 17) can be installed in an MPC to add two 40-Gigabit quad small form-factor pluggable (QSFP+) Ethernet ports.

Figure 17: SRX-MIC-2X40G-QSFP

Description

  • MIC with two QSFP+ Ethernet ports

  • Fits into MPC

  • Supports up to 80 Gbps of full-duplex traffic

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 80 Gbps

Software release

Junos OS Release 12.1X46-D10

Cables and connectors

Sockets for two QSFP+ 40-Gigabit Ethernet fiber-optic transceivers.

Supported QSFP+ transceivers:

40GBASE-SR4 (model number SRX-QSFP-40G-SR4)

40GBASE-LR4 (model number SRX-QSFP-40G-LR4)

Supported slots

Either slot in SRX5K-MPC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Green–The MIC is operating normally.

  • Red–The MIC has failed and is not operating normally.

  • Off–The MIC is powered down.

LINK LED, single color, one per QSFP+ port:

  • Green–The link is active.

  • Off–No link.

Serial number location

The serial number label is yellow and typically located as shown in Figure 18.

Figure 18: SRX-MIC-2X40G-QSFP Serial Number Label

SRX5K-MPC3-40G10G Specifications

The SRX5K-MPC3-40G10G (IOC3) is an interface card that provides 10 Gigabit Ethernet and 40 Gigabit Ethernet interfaces, with a Packet Forwarding Engine that provides a 240 Gbps line rate. This interface card is supported on SRX5400, SRX5600, and SRX5800 Services Gateways. See Figure 19.

Note

These cards do not support plug-in Modular Interface Cards (MICs).

All ports on the interface card have dual-color LEDs for reporting link status.

The interface card also supports hot-pluggable optical modules.

Figure 19: SRX5K-MPC3-40G10G

If a slot in the SRX5400, SRX5600, or SRX5800 Services Gateway card cage is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Description

  • Fixed-configuration MPC with six 40-Gigabit Ethernet ports and twenty-four 10-Gigabit Ethernet ports

  • Maximum throughput: 240 Gbps

  • Maximum configurable MTU: 9192 bytes

Software release

Junos OS Release 15.1X49-D10 and later

Supported Slots

  • SRX5400 – Any slot except the bottom slot 0, which is reserved for the SCB/RE.

  • SRX5600 – Any slot except the bottom slots 0 and 1, which are reserved for the SCB/RE.

  • SRX5800 – Any slot except the center slots 0, 1, and 2/6, which are reserved for the SCB/RE, and slots 0 (leftmost) and 11 (rightmost).

    Note: You can use the 2/6 slot to install an interface card if an SCB is not already installed in it.

Cables and connectors

Sockets for 40-Gbps QSFP+ transceivers and 10-Gbps SFP+ transceivers

See Hardware Compatibility Tool for the transceivers supported.

Power requirements

Typical: 9.68 A @ 48 V (460 W)

At different temperatures:

  • 55° C: 607 W

  • 40° C: 541 W

  • 25° C: 511 W

Weight

21 lb (9.52 kg)

Hardware features

  • Line-rate throughput of up to 240 Gbps

  • Supports up to 32,000 queues per slot

  • LAN-PHY mode at 10.3125 Gbps on a per-port basis

  • The ports are labeled as:

    • 10-Gigabit Ethernet ports: 0/0 through 0/11 and 1/0 through 1/11

    • 40-Gigabit Ethernet ports: 2/0 through 2/2 and 3/0 through 3/2

Software features

  • Optical diagnostics and related alarms

  • Two packet-forwarding engines, PFE0 and PFE1. PFE0 hosts PIC0 and PIC2. PFE1 hosts PIC1 and PIC3.

  • Configurable LAN-PHY mode options per 10-Gigabit Ethernet port

  • Intelligent oversubscription services

Note: At any one time you can have only one of the following PIC combinations powered on:

  • PIC0 & PIC1

  • PIC0 & PIC3

  • PIC2 & PIC1

  • PIC2 & PIC3

If you configure either of the following invalid PIC combinations, the chassis brings the PIC0 & PIC1 combination online instead.

  • PIC0 & PIC2

  • PIC1 & PIC3
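The rule behind these combinations is that a valid pair takes one PIC from each Packet Forwarding Engine (PFE0 hosts PIC0 and PIC2; PFE1 hosts PIC1 and PIC3). As an illustration only, and not a Junos OS API, the fallback behavior can be sketched as:

```python
# Hypothetical check for the IOC3 powered-on PIC combinations described above.
# PFE0 hosts PIC0 and PIC2; PFE1 hosts PIC1 and PIC3. A valid combination
# takes exactly one PIC from each PFE; anything else falls back to PIC0 & PIC1.

PFE0, PFE1 = {0, 2}, {1, 3}

def effective_pics(pic_a, pic_b):
    requested = {pic_a, pic_b}
    if len(requested & PFE0) == 1 and len(requested & PFE1) == 1:
        return tuple(sorted(requested))
    return (0, 1)   # chassis brings PIC0 & PIC1 online instead

print(effective_pics(2, 3))   # (2, 3)  valid: one PIC per PFE
print(effective_pics(0, 2))   # (0, 1)  invalid: both PICs on PFE0
```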

LEDs

OK/FAIL LED, one bicolor:

  • Solid green—MPC is functioning normally.

  • Blinking green—MPC is transitioning online or offline.

  • Red—MPC has failed.

10-Gigabit Ethernet LINK LED, one green per port:

  • Green—Link is up.

  • Off—Link is down or disabled.

40-Gigabit Ethernet LINK LED, one bicolor per port:

  • Green—Link is up.

  • Amber—Link is disabled.

  • Off—Link is down.

Serial Number Location

The serial number label is located as shown in Figure 20.

Figure 20: SRX5K-MPC3-40G10G Serial Number Label

SRX5K-MPC3-100G10G Specifications

The SRX5K-MPC3-100G10G (IOC3) is an interface card that provides 100 Gigabit Ethernet and 10 Gigabit Ethernet interfaces, with a Packet Forwarding Engine that provides a 240 Gbps line rate. This interface card is supported on SRX5400, SRX5600, and SRX5800 Services Gateways. See Figure 21.

Note

These cards do not support plug-in Modular Interface Cards (MICs).

All ports on the interface card have dual-color LEDs for reporting link status.

The interface card also supports hot-pluggable optical modules.

Figure 21: SRX5K-MPC3-100G10G

If a slot in the SRX5400, SRX5600, or SRX5800 Services Gateway card cage is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Description

  • Fixed-configuration MPC with two 100-Gigabit Ethernet ports and four 10-Gigabit Ethernet ports

  • Maximum throughput: 240 Gbps

  • Maximum configurable MTU: 9192 bytes

Software release

Junos OS Release 15.1X49-D10 and later

Supported Slots

  • SRX5400 – Any slot except the bottom slot 0, which is reserved for the SCB/RE.

  • SRX5600 – Any slot except the bottom slots 0 and 1, which are reserved for the SCB/RE.

  • SRX5800 – Any slot except the center slots 0, 1, and 2/6, which are reserved for the SCB/RE, and slots 0 (leftmost) and 11 (rightmost).

    Note: You can use the 2/6 slot to install an interface card if an SCB is not already installed in it.

Cables and connectors

Sockets for 100-Gbps transceivers and 10-Gbps SFP+ transceivers

See Hardware Compatibility Tool for the transceivers supported.

Power requirements

  • Typical: 10.52 A @ 48 V (505 W)

At different temperatures:

  • 55° C: 607 W

  • 40° C: 541 W

  • 25° C: 511 W

Weight

21 lb (9.52 kg)

Hardware features

  • Line-rate throughput of up to 240 Gbps

  • Supports up to 32,000 queues per slot

  • LAN-PHY mode at 10.3125 Gbps on a per-port basis

  • The ports are labeled as:

    • 10-Gigabit Ethernet ports: 0/0, 0/1, 2/0, and 2/1

    • 100-Gigabit Ethernet ports: 1/0 and 3/0

Software features

  • Configurable LAN-PHY mode options per 10-Gigabit Ethernet port

  • Optical diagnostics and related alarms

  • Intelligent oversubscription services

LEDs

OK/FAIL LED, one bicolor:

  • Solid green—MPC is functioning normally.

  • Blinking green—MPC is transitioning online or offline.

  • Red—MPC has failed.

10-Gigabit Ethernet LINK LED, one bicolor per port:

  • Green—Link is up.

  • Amber—Link is disabled.

  • Off—Link is down.

100-Gigabit Ethernet LINK LED, one bicolor per port:

  • Green—Link is up.

  • Amber—Link is disabled.

  • Off—Link is down.

Serial Number Location

The serial number label is located as shown in Figure 22.

Figure 22: SRX5K-MPC3-100G10G Serial Number Label

SRX5800 Services Gateway Interface Card Description

Interface cards provide the physical interfaces that you use to connect the services gateway to your data network. Three types of interface cards are available:

  • I/O Cards (IOCs) have fixed interface ports on the front panel of the card.

  • Flex I/O Cards (Flex IOCs) have slots on the front panel that accept smaller cards called port modules. Each port module has two or more physical interfaces on it. A Flex IOC with installed port modules functions in the same way as a regular IOC, but allows greater flexibility in adding different types of Ethernet ports to your services gateway.

  • Modular Port Concentrators (MPCs) have slots on the front panel that accept smaller cards called Modular Interface Cards (MICs). Each MIC has one or more physical interfaces on it. An MPC with MICs installed functions in the same way as a regular I/O card (IOC), but allows greater flexibility in adding different types of Ethernet ports to your services gateway. MPCs and MICs are similar in form and function to Flex IOCs and port modules. However, the two use different form-factors, so you cannot install port modules in an MPC, nor can you install MICs in a Flex IOC.

For all interface card types, the card assembly combines packet forwarding and Ethernet interfaces on a single board. The interface cards interface with the power supplies and Switch Control Boards (SCBs).

You can install interface cards in any of the slots that are not reserved for SCBs. If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Figure 23 shows typical IOCs supported on the services gateway.

Figure 23: Typical IOCs

Figure 24 shows a Flex IOC with two typical port modules installed.

Figure 24: Flex IOC with Port Modules

Figure 25 shows an MPC.

Figure 25: SRX5K-MPC

For detailed information about the interface cards, port modules, and MICs supported by the services gateway, see the SRX5400, SRX5600, and SRX5800 Services Gateway Card Reference at www.juniper.net/documentation/.

I/O Card SRX5K-40GE-SFP Specifications

The SRX5K-40GE-SFP I/O card (IOC) is optimized for Ethernet density and supports forty Gigabit Ethernet ports (see Figure 26). The IOC assembly combines packet forwarding and Ethernet interfaces on a single board, with four 10-Gbps Packet Forwarding Engines. Each Packet Forwarding Engine consists of one I-chip for Layer 3 processing and one Layer 2 network processor. The IOCs interface with the power supplies and Switch Control Boards (SCBs).

You must install at least one IOC in the services gateway. The IOC can be of any of the available IOC or Flex IOC types.

You can install IOCs in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Figure 26: IOC SRX5K-40GE-SFP

Description

  • I/O card with forty Gigabit Ethernet SFP ports

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 40 Gbps

Software release

  • Junos OS Release 9.2 and later

Cables and connectors

Forty Gigabit Ethernet SFP ports

Supported SFP transceivers:

1000BASE-LH (model numbers SRX-SFP-1GE-LH, SRX-SFP-1GE-LH-ET)

1000BASE-LX (model numbers SRX-SFP-1GE-LX, SRX-SFP-1GE-LX-ET)

1000BASE-SX (model numbers SRX-SFP-1GE-SX, SRX-SFP-1GE-SX-ET)

1000BASE-T (model numbers SRX-SFP-1GE-T, SRX-SFP-1GE-T-ET)

Controls

None

Supported Slots

  • SRX5600–Any slot except bottom slots 0 or 1

  • SRX5800–Any slot except center slots 0, 1, or 2/6

Power Requirement

312 W typical, 365 W maximum

Weight

Approximately 13 lb (5.9 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The IOC is operating normally.

  • Red–The IOC has failed and is not operating normally.

  • Off–The IOC is powered down.

Serial Number Location

The serial number label is located as shown in Figure 27.

Figure 27: Serial Number Label (IOC Shown, Other Cards Similar)

I/O Card SRX5K-4XGE-XFP Specifications

The SRX5K-4XGE-XFP I/O card (IOC) supports four 10-Gigabit Ethernet ports (see Figure 28). The IOC assembly combines packet forwarding and Ethernet interfaces on a single board, with four 10-Gbps Packet Forwarding Engines. Each Packet Forwarding Engine consists of one I-chip for Layer 3 processing and one Layer 2 network processor. The IOCs interface with the power supplies and Switch Control Boards (SCBs).

You must install at least one IOC in the services gateway. The IOC can be of any of the available IOC or Flex IOC types.

You can install IOCs in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Figure 28: IOC SRX5K-4XGE-XFP

Description

  • I/O card with four 10-Gigabit Ethernet XFP ports

  • Maximum configurable MTU: 9192 bytes

  • Maximum throughput: 40 Gbps

Software release

  • Junos OS Release 9.2 and later

Cables and connectors

Four 10-Gbps XFP ports

Supported XFP transceivers:

10GBASE-ER (model numbers SRX-XFP-10GE-ER and SRX-XFP-10GE-ER-ET)

10GBASE-LR (model numbers SRX-XFP-10GE-LR and SRX-XFP-10GE-LR-ET)

10GBASE-SR (model numbers SRX-XFP-10GE-SR and SRX-XFP-10GE-SR-ET)

Controls

None

Supported Slots

  • SRX5600–Any slot except bottom slots 0 or 1

  • SRX5800–Any slot except center slots 0, 1, or 2/6

Power Requirement

312 W typical, 365 W maximum

Weight

Approximately 13 lb (5.9 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The IOC is operating normally.

  • Red–The IOC has failed and is not operating normally.

  • Off–The IOC is powered down.

Serial Number Location

The serial number label is located as shown in Figure 29.

Figure 29: SRX5K-4XGE-XFP Serial Number Label

Flex I/O Card (SRX5K-FPC-IOC) Specifications

The SRX5K-FPC-IOC Flex I/O card (Flex IOC) (Figure 30) is an IOC with two slots that accept port modules that add Ethernet ports to your services gateway. A Flex IOC with installed port modules functions in the same way as a regular IOC, but allows greater flexibility in adding different types of Ethernet ports to your services gateway.

Each Flex IOC has a processor subsystem, which includes a 1.2-GHz CPU, a system controller, 1 GB SDRAM, and two Packet Forwarding Engines with a maximum throughput of 10 Gbps each.

You must install at least one IOC in the services gateway. The IOC can be of any of the available IOC or Flex IOC types.

You can install Flex IOCs in any of the slots that are not reserved for Switch Control Boards (SCBs). If a slot is not occupied by a card, you must install a blank panel to shield the empty slot and to allow cooling air to circulate properly through the services gateway.

Figure 30: Flex IOC with Typical Port Modules

Description

  • Flex IOC with slots for two port modules

  • Maximum throughput: 10 Gbps (per PFE)

Software release

  • Junos OS Release 9.5R1 and later

Cables and connectors

Slots for two port modules

Controls

None

Supported Slots

  • SRX5600–Any slot except bottom slots 0 or 1

  • SRX5800–Any slot except center slots 0, 1, or 2/6

Power Requirement

312 W typical, 365 W maximum (includes port modules)

Weight

Approximately 10 lb (4.5 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The Flex IOC is operating normally.

  • Red–The Flex IOC has failed and is not operating normally.

  • Off–The Flex IOC is powered down.

Serial Number Location

The serial number label is located as shown in Figure 31.

Figure 31: Serial Number Label (IOC Shown, Other Cards Similar)

Flex I/O Card Port Module SRX-IOC-16GE-SFP Specifications

You use port modules and Flex I/O Cards (Flex IOCs) to add different combinations of small form-factor pluggable transceiver (SFP), 10-gigabit SFP transceiver (XFP), and copper ports to your services gateway to suit the specific needs of your network. The SRX-IOC-16GE-SFP port module (Figure 32) installs into a Flex IOC to add sixteen 10/100/1000 Ethernet SFP ports.

Figure 32: Flex IOC Port Module SRX-IOC-16GE-SFP

Description

  • Port module with sixteen Gigabit Ethernet SFP ports

  • Maximum throughput: 10 Gbps

  • Oversubscription ratio: 1.6:1

  • Maximum configurable MTU: 9192 bytes

Software release

  • Junos OS Release 9.5R1 and later

Cables and connectors

Sixteen Gigabit Ethernet SFP ports

Supported SFP transceivers:

1000BASE-LH (model numbers SRX-SFP-1GE-LH, SRX-SFP-1GE-LH-ET)

1000BASE-LX (model numbers SRX-SFP-1GE-LX, SRX-SFP-1GE-LX-ET)

1000BASE-SX (model numbers SRX-SFP-1GE-SX, SRX-SFP-1GE-SX-ET)

1000BASE-T (model numbers SRX-SFP-1GE-T, SRX-SFP-1GE-T-ET)

Controls

ONLINE Button–The ONLINE button on the port module front panel toggles the port module online and offline.

Supported Slots

Either slot in SRX5K-FPC-IOC Flex IOC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The port module is operating normally.

  • Red–The port module has failed and is not operating normally.

  • Off–The port module is powered down.

LINK LED, single color, one per port:

  • Steady green–The link is active.

  • Off–No link.

TX/RX LED, single color, one per port:

  • Blinking Green–The port is receiving or transmitting data.

  • Off–No activity.

Serial Number Location

The serial number label is located as shown in Figure 33.

Figure 33: Port Module SRX-IOC-16GE-SFP Serial Number Label

Flex I/O Card Port Module SRX-IOC-16GE-TX Specifications

You use port modules and Flex I/O Cards (Flex IOCs) to add different combinations of small form-factor pluggable transceiver (SFP), 10-gigabit SFP transceiver (XFP), and copper ports to your services gateway to suit the specific needs of your network. The SRX-IOC-16GE-TX port module (Figure 34) installs into a Flex IOC to add sixteen 10/100/1000 Ethernet RJ-45 copper ports.

Figure 34: Flex IOC Port Module SRX-IOC-16GE-TX

Description

  • Port module with sixteen 10/100/1000 Ethernet RJ45 ports

  • Maximum throughput: 10 Gbps

  • Oversubscription ratio: 1.6:1

  • Maximum configurable MTU: 9192 bytes

Software release

  • Junos OS Release 9.5R1 and later

Cables and connectors

Sixteen RJ-45 1-Gbps ports

Controls

ONLINE Button–The ONLINE button on the port module front panel toggles the port module online and offline.

Supported Slots

Either slot in SRX5K-FPC-IOC Flex IOC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The port module is operating normally.

  • Red–The port module has failed and is not operating normally.

  • Off–The port module is powered down.

LINK LED, single color, one per port:

  • Steady green–The link is active.

  • Off–No link.

TX/RX LED, single color, one per port:

  • Blinking green–The port is receiving or transmitting data.

  • Off–No activity.

Serial Number Location

The serial number label is located as shown in Figure 35.

Figure 35: Port Module SRX-IOC-16GE-TX Serial Number Label

Flex I/O Card Port Module SRX-IOC-4XGE-XFP Specifications

You use port modules and Flex I/O Cards (Flex IOCs) to add different combinations of small form-factor pluggable transceiver (SFP), 10-gigabit SFP transceiver (XFP), and copper ports to your services gateway to suit the specific needs of your network. The SRX-IOC-4XGE-XFP port module (Figure 36) installs into a Flex IOC to add four 10-Gigabit Ethernet XFP ports.

Figure 36: Flex IOC Port Module SRX-IOC-4XGE-XFP

Description

  • Port module with four 10-Gigabit Ethernet XFP ports

  • Maximum throughput: 10 Gbps

  • Oversubscription ratio: 4:1

  • Maximum configurable MTU: 9192 bytes
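The oversubscription ratios quoted for these port modules follow from simple arithmetic: aggregate port capacity divided by the module's 10-Gbps maximum throughput. As an illustration only (the helper name is hypothetical):

```python
# Illustrative oversubscription arithmetic for the Flex IOC port modules:
# aggregate port capacity divided by module throughput.

def oversubscription(ports, port_gbps, module_gbps):
    return (ports * port_gbps) / module_gbps

print(oversubscription(4, 10, 10))   # SRX-IOC-4XGE-XFP: 4.0, i.e. 4:1
print(oversubscription(16, 1, 10))   # SRX-IOC-16GE-SFP: 1.6, i.e. 1.6:1
```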

Software release

  • Junos OS Release 9.5R1 and later

Cables and connectors

Four 10-Gbps XFP ports

Supported XFP transceivers:

10GBASE-ER (model numbers SRX-XFP-10GE-ER and SRX-XFP-10GE-ER-ET)

10GBASE-LR (model numbers SRX-XFP-10GE-LR and SRX-XFP-10GE-LR-ET)

10GBASE-SR (model numbers SRX-XFP-10GE-SR and SRX-XFP-10GE-SR-ET)

Controls

ONLINE Button–The ONLINE button on the port module front panel toggles the port module online and offline.

Supported Slots

Either slot in SRX5K-FPC-IOC Flex IOC

Weight

Approximately 1.6 lb (0.7 kg)

LEDs

OK/FAIL LED, one bicolor:

  • Steady green–The port module is operating normally.

  • Red–The port module has failed and is not operating normally.

  • Off–The port module is powered down.

LINK LED, single color, one per port:

  • Steady green–The link is active.

  • Off–No link.

Serial Number Location

The serial number label is located as shown in Figure 37.

Figure 37: Port Module SRX-IOC-4XGE-XFP Serial Number Label