Scali MPI Connect Release 4.4 Users Guide


4.4 Using the scanalyze

user% SCAMPI_TIMING="-s 10" mpimon ./all2all -- r1 r2

This run produced a 158642-byte file.

Digesting the mass of information in these files is a challenge, but scanalyze condenses it. For tracing, it produces the following summaries:

 

Count              Total    < 128     < 1k     < 8k   < 256k     < 1M
--------------- -------- -------- -------- -------- -------- -------- --------
MPI_Alltoall       24795     5127     3078     4104    10260     2226        0
MPI_Barrier           52        0        0        0        0        0        0
MPI_Comm_rank          2        0        0        0        0        0        0
MPI_Comm_size          2        0        0        0        0        0        0
MPI_Init               2        0        0        0        0        0        0
MPI_Keyval_free        2        0        0        0        0        0        0
MPI_Wtime            102        0        0        0        0        0        0

 

Timing             Total    < 128     < 1k     < 8k   < 256k     < 1M
--------------- -------- -------- -------- -------- -------- -------- --------
MPI_Alltoall       21.20     0.21     0.15     0.41     9.35    11.08     0.00
MPI_Barrier         0.01     0.00     0.00     0.00     0.00     0.00     0.00
MPI_Comm_rank       0.00     0.00     0.00     0.00     0.00     0.00     0.00
MPI_Comm_size       0.00     0.00     0.00     0.00     0.00     0.00     0.00
MPI_Init            2.00     0.00     0.00     0.00     0.00     0.00     0.00
MPI_Keyval_free     0.00     0.00     0.00     0.00     0.00     0.00     0.00
MPI_Wtime           0.00     0.00     0.00     0.00     0.00     0.00     0.00
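These fixed-width summaries are straightforward to post-process. The following is a minimal sketch, not part of any Scali tool, that reads such a summary into a dictionary; it assumes the whitespace-separated row layout shown above, and the `parse_summary` name is purely illustrative:

```python
# Parse a scanalyze-style summary into {function: [total, per-bin values...]}.
# Assumes data rows start with an MPI function name followed by numbers,
# as in the example tables above.

def parse_summary(text):
    table = {}
    for line in text.splitlines():
        fields = line.split()
        # Skip the header and separator rows; keep only MPI_* data rows.
        if fields and fields[0].startswith("MPI_"):
            table[fields[0]] = [int(f) for f in fields[1:]]
    return table

counts = parse_summary("""\
Count              Total    < 128     < 1k     < 8k   < 256k     < 1M
--------------- -------- -------- -------- -------- -------- -------- --------
MPI_Alltoall       24795     5127     3078     4104    10260     2226        0
MPI_Barrier           52        0        0        0        0        0        0
""")

print(counts["MPI_Alltoall"][0])   # total number of MPI_Alltoall calls
```

The same loop works for the timing table if `int` is replaced by `float`.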

and the following for timing. For each rank, the first three columns give the number of calls, the total time, and the time per call since the previous report; the last three give the same figures accumulated over the entire run:

=========================================================================
                     #calls     time  tim/cal    #calls     time  tim/cal
0: MPI_Alltoall           0    0.0ns             12399    10.6s  855.1us
0: MPI_Barrier            0    0.0ns                26    1.2ms   45.8us
0: MPI_Comm_rank          0    0.0ns                 1    3.2us    3.2us
0: MPI_Comm_size          0    0.0ns                 1    1.4us    1.4us
0: MPI_Init               0    0.0ns                 1     1.0s     1.0s
0: MPI_Keyval_free        1   27.9us   27.9us        1   27.9us   27.9us
0: MPI_Wtime              1    1.1us    1.1us       52   33.9us  652.7ns
0: Sum                    2   29.0us   14.5us    12481    11.7s  933.5us
0: Overhead               0    0.0ns             12481   12.6ms    1.0us
=========================================================================
                     #calls     time  tim/cal    #calls     time  tim/cal
1: MPI_Alltoall           0    0.0ns             12399    10.6s  854.9us
1: MPI_Barrier            0    0.0ns                26    2.9ms  109.6us
1: MPI_Comm_rank          0    0.0ns                 1    3.5us    3.5us
1: MPI_Comm_size          0    0.0ns                 1    1.5us    1.5us
1: MPI_Init               0    0.0ns                 1     1.0s     1.0s
1: MPI_Keyval_free        1   10.8us   10.8us        1   10.8us   10.8us
1: MPI_Wtime              1    1.5us    1.5us       50   36.5us  730.2ns
1: Sum                    2   12.3us    6.1us    12479    11.6s  931.1us
1: Overhead               0    0.0ns             12479   12.7ms    1.0us
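The tim/cal column is simply the total time divided by the number of calls. A quick sanity check of rank 0's MPI_Alltoall row above confirms this (the `time_per_call_us` helper is illustrative, not part of any Scali tool; small deviations are expected because the reported totals are rounded):

```python
# tim/cal = total time / number of calls.
# Verify against rank 0's MPI_Alltoall row: 10.6s over 12399 calls -> ~855us.

def time_per_call_us(total_seconds, calls):
    """Average time per call, in microseconds."""
    return total_seconds / calls * 1e6

avg = time_per_call_us(10.6, 12399)
print(round(avg, 1))   # close to the reported 855.1us
```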

