
4.5 Using SMC's built-in CPU-usage functionality

Scali MPI Connect can report wall-clock time, plus user and system CPU time, for all processes through a built-in CPU-timing facility. To use SMC's built-in CPU-usage timing, first set the environment variable SCAMPI_CPU_USAGE.

The information displayed is collected with the times() system call; see the times(2) man page for more information.

The output consists of two blocks. The first block contains the CPU usage of the submonitors on the different nodes: one line per submonitor, followed by a sum line and an average line. The second block consists of one line per process, also followed by a sum line and an average line.

For example, to get the CPU usage when running the image enhancement program, do:

user% SCAMPI_CPU_USAGE=1 mpirun -np 4 ./kollektive-8 ./uf256-8.pgm

This produces the following report:

Submonitor timing stat. in secs

                                  ---------- Own -----------   ------ Own+Children ------
                        Elapsed     User   System      Sum       User   System      Sum
Submonitor-1@r9           2.970    0.000    0.000    0.000      0.090    0.030    0.120
Submonitor-2@r8           3.250   -0.000    0.000   -0.000      0.060    0.040    0.100
Submonitor-3@r7           3.180   -0.000   -0.000   -0.000      0.050    0.030    0.080
Submonitor-4@r6           3.190    0.010    0.000    0.010      0.090    0.020    0.110
Total for submonitors    12.590    0.010   -0.000    0.010      0.290    0.120    0.410
Average per submonitor    3.147    0.003   -0.000    0.003      0.073    0.030    0.103

Process timing stat. in secs

                                  ---------- Own -----------
                        Elapsed     User   System      Sum
kollektive-8-0@r9         0.080    0.070    0.030    0.100
kollektive-8-1@r8         0.050    0.020    0.040    0.060
kollektive-8-2@r7         0.050    0.020    0.030    0.050
kollektive-8-3@r6         0.010    0.020    0.020    0.040
Sum for processes         0.190    0.130    0.120    0.250
Average per process       0.048    0.033    0.030    0.062

Elapsed is the wall-clock time used by the user process/submonitor.
User is the CPU time used in the user process/submonitor.
System is the CPU time used in system calls.
Sum is the total CPU time used by the user process/submonitor.

Scali MPI Connect Release 4.4 Users Guide
