Class PerformanceApi

The underlying API that provides hooks into Node's Performance Timing APIs.

Use this class in place of relying on Node's experimental APIs.

NOTE:

This class is not intended to be instantiated directly. It should be accessed through the PerfTiming variable instantiated by this package. Failure to do so will result in lost metrics in the final file output.
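To make the intent of this note concrete, here is a simplified, self-contained model (not the package's actual implementation) of why a single shared instance matters: each instance keeps its own metric collections, so data recorded on a stray instance never reaches the instance that writes the final output.

```typescript
// Hedged sketch: MetricsApi stands in for PerformanceApi, and
// sharedApi stands in for the PerfTiming-provided instance. Names
// here are illustrative, not the package's real internals.
class MetricsApi {
  private marks: string[] = [];

  public mark(name: string): void {
    this.marks.push(name);
  }

  // Only the marks recorded on THIS instance appear in its output.
  public flush(): string[] {
    return this.marks;
  }
}

// The package exports one shared instance, analogous to PerfTiming.api.
const sharedApi = new MetricsApi();

sharedApi.mark("recorded");     // collected by the shared instance
new MetricsApi().mark("lost");  // a stray instance: never flushed

console.log(sharedApi.flush()); // only "recorded" was collected
```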

Hierarchy

  • PerformanceApi

Implements

Index

Constructors

constructor

Properties

Private _functionObservers

_functionObservers: CollectionMap<IFunctionObserver> = new Map()

Internal map of all created function timers. The key represents the name and the value is an IFunctionObserver instance.

Private _manager

The manager of this API class.

Private _measurementObservers

_measurementObservers: CollectionMap<IMeasurementObserver> = new Map()

Internal map of all created measurement timers. The key represents the name and the value is an IMeasurementObserver instance.

Private _perfHooks

_perfHooks: "perf_hooks"

This variable holds the import of the Node.js perf_hooks library.

timerify

timerify: watch = this.watch

Created for consistent reference to Node Performance Timing API.

see

PerformanceApi.watch

untimerify

untimerify: unwatch = this.unwatch

Created for consistent reference to Node Performance Timing API.

see

PerformanceApi.unwatch

Static Private _errorImport

_errorImport: "src/performance/api/errors/index"

Cache of loaded error objects. Only populated on first call to PerformanceApi._errors

Accessors

Static Private _errors

  • get _errors(): "src/performance/api/errors/index"
  • Internal accessor for the errors that can be thrown. It performs the require only once and caches the result for later calls.

    Returns "src/performance/api/errors/index"
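The lazy-load-and-cache pattern described here can be sketched in isolation. `loadErrors` below is a stand-in for the real require of the errors module, and the counter exists only to demonstrate that the load runs once:

```typescript
// Sketch of a memoizing static getter (illustrative names, not the
// package's real implementation). The expensive load happens on the
// first access; later accesses return the cached result.
let loadCount = 0;

function loadErrors(): { TimerNameConflictError: string } {
  loadCount++; // expensive work (e.g. a require call) happens here
  return { TimerNameConflictError: "TimerNameConflictError" };
}

class ErrorsHost {
  private static _errorImport: ReturnType<typeof loadErrors> | undefined;

  private static get _errors() {
    if (ErrorsHost._errorImport == null) {
      ErrorsHost._errorImport = loadErrors();
    }
    return ErrorsHost._errorImport;
  }

  // Access the getter twice and report how many loads occurred.
  public static demo(): number {
    void ErrorsHost._errors;
    void ErrorsHost._errors;
    return loadCount;
  }
}

console.log(ErrorsHost.demo()); // 1
```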

Methods

Private _addPackageNamespace

  • _addPackageNamespace(value: string): string
  • Returns the input string with the package namespace prepended.

    Parameters

    • value: string

      The string to prepend the namespace to.

    Returns string

    The value with the namespace.
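A minimal sketch of such namespacing, using the `"example@1.0.0"` identifier and `": "` separator that appear in the measure example elsewhere in this document (the real class derives the namespace from its manager, which is elided here):

```typescript
// Hedged sketch: the namespace is hard-coded for illustration.
// Real marks end up stored as "<package>@<version>: <name>".
const packageNamespace = "example@1.0.0";

function addPackageNamespace(value: string): string {
  return `${packageNamespace}: ${value}`;
}

console.log(addPackageNamespace("before loop")); // "example@1.0.0: before loop"
```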

clearMarks

  • clearMarks(name?: string): void

getMetrics

getNodeTiming

getSysInfo

mark

  • mark(name: string): void

measure

  • measure(name: string, startMark: string, endMark: string): void
  • The measure method creates a new measurement between two marks.

    The measurement name does not have to be unique. Reusing a name adds another entry under the corresponding name in the IMetrics.measurements array of the formatted run output.

    Also, just like mark and clearMarks, the names passed to this function are prefixed with a package namespace. This is because multiple instances of the performance API may be present when the package appears in more than one dependency of a project; the namespace reduces name clashes between packages.

    For example: Assume that the measure method was called like so:

    // marks were created above
    // api instantiated as api
    api.measure("name1", "start1", "end1");
    api.measure("name2", "start2", "end2");
    api.measure("name2", "start3", "end3");
    api.measure("name1", "start4", "end4");

    The final result of getMetrics would look something like this:

    {
      "measurements": [
        {
          "name": "name1",
          "entries": [
            {"startMark": "start1", "endMark": "end1"},
            {"startMark": "start4", "endMark": "end4"}
          ]
        },
        {
          "name": "name2",
          "entries": [
            {"startMark": "start2", "endMark": "end2"},
            {"startMark": "start3", "endMark": "end3"}
          ]
        }
      ]
    }

    Example

    import { PerfTiming } from "@zowe/perf-timing";
    
    // Assume package is example@1.0.0
    
    // Tracked internally by node as "example@1.0.0: before loop"
    PerfTiming.api.mark("before loop");
    
    for (let i = 0; i < 10000; i++) {}
    
    // Tracked internally by node as "example@1.0.0: after loop"
PerfTiming.api.mark("after loop");
    
    // References the internally tracked "before loop" and "after loop" marks and creates a measurement
    // tracked internally as "example@1.0.0: loop".
    PerfTiming.api.measure("loop", "before loop", "after loop");

    Parameters

    • name: string

      The name of the measurement. Use this to track multiple measurements under the same final array output.

    • startMark: string

      The name of the start mark created with mark

    • endMark: string

      The name of the end mark created with mark

    Returns void

unwatch

  • unwatch<T>(fn: T, name?: string): T
  • Unwatches a function that was watched. If performance is not enabled, it returns the function passed as the first parameter.

    It is recommended to first check manually if performance is enabled before using the watch and unwatch methods for performance reasons.

    Example 1

    import { PerfTiming } from "@zowe/perf-timing";
    
    let fn = () => "test";
    
    fn = PerfTiming.api.watch(fn);
    
    fn();
    
    fn = PerfTiming.api.unwatch(fn);

    Example 2

    // Recommended to check if performance is enabled first due to the need
    // to assign values.
    import { PerfTiming } from "@zowe/perf-timing";
    
    let fn = () => "test";
    
    if (PerfTiming.enabled) // Check if performance is enabled before assigning
      fn = PerfTiming.api.watch(fn);
    
    fn();
    
    if (PerfTiming.enabled) // Check if performance is enabled before assigning
      fn = PerfTiming.api.unwatch(fn);

    Type parameters

    • T: function

      The type of the function passed in the first parameter.

    Parameters

    • fn: T

      The function to unwatch.

    • Optional name: string

      The name that the observer was given, only relevant if the observer was given a name on the watch. If omitted, the value of fn.name will be used as the name.

    Returns T

watch

  • watch<T>(fn: T, name?: string): T
  • Wraps a function in a Node monitoring function, allowing metrics to be captured about the watched function.

    Watching a function allows the collection of the following items:

    • The total number of calls to the function while watched.
    • The time each individual call took and the parameters sent in.
    • The total time spent in the function across all calls.
    • The average time per function call.

    NOTE:

For proper time tracking to occur, watch returns the original function wrapped in a Node tracking function. The variable holding the original function must be reassigned to the return value of this method for metrics to be gathered. Unlike in most languages, a parameter cannot be truly passed by reference in JavaScript; if it could, this function would modify the reference in the first parameter directly and no extra work would be required of the programmer.

    NOTE:

    Since Node only uses the function name as the measurement name, it is recommended to ensure that the fn.name passed into this function is unique compared to all functions that you are watching (including those from other packages). Failure to do so will result in a measurement being applied to multiple watches.

For example: In the code example below, a call to either function will result in both watches capturing the call.

    import { PerfTiming } from "@zowe/perf-timing";
    
    let fn1 = function test() {
      console.log("fn1");
    }
    
    let fn2 = function test() {
      console.log("fn2");
    }
    
    fn1 = PerfTiming.api.watch(fn1, "name 1");
    fn2 = PerfTiming.api.watch(fn2, "name 2");
    
    // The below line will log fn1 but will trigger a metric entry in the watch
    // for both fn1 and fn2 because fn1.name === fn2.name. To counteract this,
    // simply choose a different name between the two functions.
    fn1();

    NOTE:

    It is recommended to first check manually if performance is enabled before using the watch and unwatch methods for performance reasons.

    Example

    // Track time spent in require calls
    import { PerfTiming } from "@zowe/perf-timing";
    
    if (PerfTiming.enabled) {
      const Module = require("module");
      // Store the reference to the original require.
      const originalRequire = Module.prototype.require;
    
      // Watch a wrapper named function so we can be sure that not just
      // any anonymous function gets checked.
      Module.prototype.require = PerfTiming.api.timerify(function NodeModuleLoader() {
        return originalRequire.apply(this, arguments);
      });
    }
    
    // Continue your application code. Also note, there is no need to unwatch a function.
    // If a function is not unwatched, this will be handled internally by the data collection
    // functions.

    Type parameters

    • T: function

      The type of the function passed in the first parameter.

    Parameters

    • fn: T

      The function to watch. This function will be sent into the Node Timerify function. This will wrap the function passed in a monitoring function.

    • Optional name: string

      An optional name to give the timer. If the name is not specified, then the value of fn.name will be used for tracking.

    Returns T

Static Private _aggregateData

  • Aggregate all entries present in a CollectionMap into an array output.

    Each item present in the map corresponds to a single entry in the final array output. The map key becomes the name property, and each data point is stored in the final object. The statistical analysis is generated from the IPerformanceEntry instances in the entries portion of that element.

    Type parameters

    Parameters

    Returns Array<IMetric<T>>

    An array of IMetric data points representing each key/value present in the CollectionMap.
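A simplified, self-contained model of this aggregation (field names approximate IMetric and are illustrative, not the package's exact types):

```typescript
// Hedged sketch: each map key becomes the `name` of one output
// element, and simple statistics are derived from entry durations.
interface PerformanceEntryLike {
  duration: number;
}

interface MetricLike {
  name: string;
  calls: number;
  totalDuration: number;
  averageDuration: number;
  entries: PerformanceEntryLike[];
}

function aggregateData(
  map: Map<string, { entries: PerformanceEntryLike[] }>
): MetricLike[] {
  const result: MetricLike[] = [];
  map.forEach((observer, name) => {
    const total = observer.entries.reduce((sum, e) => sum + e.duration, 0);
    result.push({
      name,
      calls: observer.entries.length,
      totalDuration: total,
      averageDuration:
        observer.entries.length > 0 ? total / observer.entries.length : 0,
      entries: observer.entries,
    });
  });
  return result;
}

const metrics = aggregateData(
  new Map([["loop", { entries: [{ duration: 2 }, { duration: 4 }] }]])
);
console.log(metrics[0].name, metrics[0].totalDuration, metrics[0].averageDuration); // loop 6 3
```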

Generated using TypeDoc