Anonymizer

Most recent version: v0.0.1

See the changelog of this Action type here.

Overview

The Anonymizer Action modifies sensitive data to remove or mask personally identifiable information, ensuring privacy.
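
Conceptually, the Action replaces the selected field's value with an irreversible, anonymized representation. Here is a minimal sketch of that idea in Python, assuming a salted SHA-256 hash purely for illustration (the Action's actual algorithm is not described on this page):

```python
import hashlib

def anonymize_value(value: str, salt: str) -> str:
    """Illustrative only: mask a field value with a salted SHA-256 digest."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

# Hypothetical field value and salt, used only for demonstration.
print(anonymize_value("jane.doe@example.com", "0123456789abcdef0123456789abcdef"))
```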

Ports

These are the input and output ports of this Action:

Input ports
  • Default port - All the events to be processed by this Action enter through this port.

Output ports
  • Default port - Events are sent through this port if no error occurs while processing them.

  • Error port - Events are sent through this port if an error occurs while processing them.

Configuration

1

Find Anonymizer in the Actions tab (under the Advanced group) and drag it onto the canvas. Link it to the required Listener and Data sink.

2

To open the configuration, click the Action on the canvas and select Configuration.

3

Enter the required parameters:

  • Field to anonymize* - Select an input event field to anonymize.

  • Anonymize Operation* - Choose one of the following operations:
      • Hash Anonymizer - Hashes any type of data to make it anonymous.
      • IP Anonymizer - Encrypts IP addresses. Note that the input IP addresses must be in IPv4 format.

  • Salt* - A random value added to the data before it is hashed, typically used to enhance security. Note that the salt length must be 32 characters. Learn more about salt in cryptography here.
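
To make the relationship between these parameters concrete, here is a hedged sketch that assumes salted SHA-256 hashing and a hypothetical 32-character salt; it is not the Action's actual Hash Anonymizer or IP Anonymizer implementation:

```python
import hashlib
import ipaddress

SALT = "0123456789abcdef0123456789abcdef"  # hypothetical salt, exactly 32 characters

def hash_anonymize(value: str, salt: str) -> str:
    # Hash Anonymizer idea: salt the value, then hash it irreversibly.
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

def ip_anonymize(ip: str, salt: str) -> str:
    # IP Anonymizer idea: the input must be a valid IPv4 address.
    addr = ipaddress.ip_address(ip)
    if addr.version != 4:
        raise ValueError("Input IP addresses must be in IPv4 format")
    return hash_anonymize(str(addr), salt)

print(hash_anonymize("user-1234", SALT))    # any type of data
print(ip_anonymize("192.168.0.10", SALT))   # IPv4 addresses only
```

With a fixed salt, the same input always produces the same output, so anonymized values can still be grouped or correlated across events without exposing the original data.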

4

Click Save to complete.

Example

Let's say we have a list of IPs we wish to anonymize in one of our event fields. To do so:

1

Add the Anonymizer Action to your Pipeline and link it to your required Listener.

2

Now, double-click the Anonymizer Action to configure it. You need to set the following config:

  • Field to anonymize* - We choose the required field with the IPs to be anonymized.

  • Anonymize Operation* - We need the IP Anonymizer operation.

  • Salt* - We're adding the following salt value to make decryption more difficult: D;%yL9TS:5PalS/du874jsb3@o09'?j5

3

Now link the Default output port of the Action to the input port of your Data sink.

4

Finally, click Publish and choose which clusters you want to publish the Pipeline in.

5

Click Test pipeline at the top of the area and choose a specific number of events to test if your data is transformed properly. Click Debug to proceed.

This is how your data will be transformed: the original field values (Input data) are replaced by their anonymized equivalents (Output data).
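
As a hypothetical illustration of the example configuration (the field name src_ip and the sample IPs are invented, and the salted-hash assumption stands in for whatever the Action does internally), the transformation looks roughly like this:

```python
import hashlib

salt = "D;%yL9TS:5PalS/du874jsb3@o09'?j5"   # the 32-character salt from the example

# Hypothetical input events; "src_ip" stands in for the configured field.
events = [{"src_ip": "10.0.0.15"}, {"src_ip": "172.16.4.2"}]

for event in events:
    original = event["src_ip"]
    event["src_ip"] = hashlib.sha256((salt + original).encode("utf-8")).hexdigest()
    print(original, "->", event["src_ip"])
```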
