You can learn more about how metrics are handled by visiting the Metrics page.
All features on Monitoring have their own metrics tab, containing specific data related to each type of event.
The Event Stores tab presents the events generated by all of your event pipes, which are then made available for download as a .JSON file. To learn more about each metric related to this feature, go to the Event Stores Metrics article.
The Event Pipes tab presents metrics for all of the pipes you have created and tells you whether the pipes are working correctly. To learn more about each metric related to this feature, go to the Event Pipes Metrics article.
Learn more about how Event Pipes' metrics work.
You can learn more about how metrics are handled by visiting the Metrics page.
To access the Metrics tab, select an Event Pipe and all of its metrics will be displayed. The metrics are populated once your event pipes have been created, enabled, and are collecting data.
This metric is populated each time an event matches your event pipe's criteria; once an event meets your configuration, its data is collected.
Example: Every time an event meets the criteria established during event pipe creation, this metric will be updated. It is important to monitor this metric, as it indicates whether the criteria are being correctly met; otherwise, there will be no data to collect. In this graph, you can see that at some point, it peaked at nearly 250,000 events. The metric will vary based on the number of campaigns running at the time.
This metric is populated after matched events succeed. Every matched event is sent to a target execution call according to the established criteria, so this metric will nearly always follow the same pattern as the matched event count.
Example: After an event matches the criteria, our system will execute the established event pipe, collect the data according to the parameters, and then send it to an event store or webhook, as configured. In this graph, you can see that the count closely follows the matched event count.
This metric is populated by call execution failures. If there is any error or instability on the platform, or a problem with the event pipe's configuration, our system will populate the failure rate metric.
Example: If a call execution fails, check your event pipe configuration and whether the platform is experiencing instability, as either can cause the call execution to fail. In this graph, you can see that over the period of a month, there was a 1% failure rate on one day, followed by no failures afterward.
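As a rough illustration of how the three event pipe metrics relate, the sketch below derives a failure rate from hypothetical matched, executed, and failed counts; the numbers are made up for illustration and are not read from the BMS API.

```python
# Minimal sketch relating the three event pipe metrics described above.
# All counts are hypothetical; they are not pulled from the platform.
matched_event_count = 250_000     # events that met the pipe's filter criteria
call_execution_count = 249_800    # matched events sent to the target (store or webhook)
call_execution_failures = 2_498   # executions that failed

failure_rate = call_execution_failures / call_execution_count * 100
print(f"Failure rate: {failure_rate:.1f}%")  # 1.0% with these made-up counts
```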
Our Event Pipes are used to collect data from all products on our platform. You can use this feature to keep track of everything that can cause an event at BMS, from campaigns to organization creation, allowing you to organize the provided data as you wish.
Every event pipe you create sends its data either to an Event Store or to a webhook that delivers the data directly to your data management tool. We provide templates for every event we can track, so you can pick which events you want to collect data from.
We have a solution that may help you understand how a webhook works; check this article here.
To create an event pipe, click on , and an event pipe creation screen will pop up.
Name - Set a name for your Event Pipe.
Tags - Create tags to better identify each event pipe.
To start configuring your event pipe, pick a template of your choice in our Filters tab according to your needs.
Sample Event Template - In this dropdown menu, you will be provided with all templates related to the events that we can track.
In this example, we will be creating an event pipe for ADS - Delivered. This event tracks every Ad that is currently running in a campaign and is delivering impressions. Once we create this event pipe, all data will be sent to an Event Store or a Webhook.
Once the template is selected, you can check which data will be collected by the corresponding Event Pipe.
Path - The specific tag by which you can identify the event. Ex: id, type, source, data.accountId.
Operator - The rule that will be applied by this filter.
Value - The specific value the tag must contain.
In this case, we will be collecting data from ad-delivered events, so this is how the configuration would look.
We specified the Path using the tag "Type", set the Operator to "Contains", and added the Value "Ad-delivered". Note that these fields are case-sensitive, so the values you enter must match the template.
In this case, our filter failed verification because of the Value field. These fields are case-sensitive and must match the corresponding information.
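To make the Path, Operator, and Value behavior more concrete, here is a minimal sketch (in Python, as an illustration only) of how a case-sensitive "Contains" filter could be evaluated against an event, including dotted paths such as data.accountId. The event structure and helper functions are assumptions for illustration; they are not the exact ADS - Delivered template or the platform's internal logic.

```python
# Minimal sketch of a case-sensitive "Contains" filter, assuming a JSON-like event.
# The event below is a made-up example, not the exact ADS - Delivered template.
event = {
    "id": "evt-123",
    "type": "ads.delivered",
    "source": "campaigns",
    "data": {"accountId": "acc-42"},
}

def read_path(evt: dict, path: str):
    """Walk a dotted Path such as 'data.accountId' through nested dictionaries."""
    value = evt
    for key in path.split("."):
        value = value.get(key) if isinstance(value, dict) else None
    return value

def matches(evt: dict, path: str, operator: str, value: str) -> bool:
    field = read_path(evt, path)
    if operator == "Contains":
        # Case-sensitive, as noted above: "Ad-delivered" would not match "ads.delivered".
        return isinstance(field, str) and value in field
    raise ValueError(f"Unsupported operator: {operator}")

print(matches(event, "type", "Contains", "ads.delivered"))  # True
print(matches(event, "type", "Contains", "Ad-delivered"))   # False: wrong casing
```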
Once you have added your filters, move to the Targets tab.
Here you will decide where to send the data: whether it will be sent to one or multiple event stores or a webhook.
You must create an event store before using it as a target for your event pipes.
Learn more about Event Stores.
Name your target.
Select "Send to Event Store".
Select the event store you want to use.
It is possible to send your data to a webhook provided by your data management tool. Select this option to use your preferred data management tool, then fill out the details. A minimal receiver sketch is shown after the field descriptions below.
Name your target.
Select "Call Webhook".
Insert your data management tool's webhook URL.
Payload - Information that will be sent to your webhook.
Result - Status returned and the latency.
Request - Request used by BMS to send you the information for this test.
Response - Received response from your webhook.
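If you are wondering what needs to exist on the receiving side, the sketch below is a minimal webhook endpoint that accepts the test Payload and returns the status reflected in the Result and Response tabs. It assumes a Flask application and a made-up /bms-events route; adapt it to whatever your data management tool actually exposes.

```python
# Minimal sketch of a webhook receiver, assuming Flask is installed.
# The /bms-events route and the handling below are illustrative, not a BMS requirement.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/bms-events", methods=["POST"])
def receive_event():
    payload = request.get_json(force=True, silent=True)  # the Payload sent by the test
    if payload is None:
        return jsonify({"error": "expected a JSON body"}), 400
    # Forward the event to your data management tool here (queue, database, etc.).
    print("Received event:", payload.get("type"), payload.get("id"))
    return jsonify({"status": "received"}), 200  # reflected in the Result and Response tabs

if __name__ == "__main__":
    app.run(port=8080)
```

A plain 200 response is usually enough for the test; the latency shown in the Result tab largely depends on how quickly this handler answers.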
When selecting two or more event pipes, the Bulk Actions will be enabled, allowing you to perform actions in bulk.
You will be able to archive and delete event pipes in bulk.
We advise archiving instead of deleting; only delete if you are certain, as the action cannot be undone.
In our Event Stores tab, you can create a store that acts as a database for storing events. You will be able to create a store where your event pipes will be sent. You can then download the events as a JSON file and import them into your data management tool to organize them better.
Name - Set a name for your Event Store.
Tags - Create tags to better identify each event store.
Retention - Decide how long the data collected by this event store will be kept; after expiration, the data will be deleted (see the sketch below).
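As a rough illustration of how a retention window behaves, the sketch below checks whether a stored event stream has passed its expiration date. The 30-day retention value and the timestamps are assumptions for illustration only.

```python
# Minimal sketch of a retention check, assuming a hypothetical 30-day retention setting.
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # hypothetical value chosen when creating the event store

def is_expired(uploaded_at: datetime, now: datetime) -> bool:
    """An event stream is deleted once it is older than the retention window."""
    return now - uploaded_at > timedelta(days=RETENTION_DAYS)

uploaded = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(is_expired(uploaded, now=datetime(2024, 2, 15, tzinfo=timezone.utc)))  # True: past 30 days
```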
When selecting two or more event stores, the Bulk Actions will be enabled, allowing you to perform actions in bulk.
You will be able to archive, delete, enable, and disable event stores in bulk.
Attention! If you delete an event store, all data related to that event store will be deleted.
Once you create your event stores and pipes, all triggered events will be presented on your Event Streams tab and will be available for download according to your data retention settings.
This data will only be displayed if there is an event pipe sending data to an event store.
The JSON-line format offers a raw and unfiltered view of the event stream, capturing all relevant information as events occur. This format ensures that no data is left out, giving you complete visibility into each transaction for detailed analysis or integration into custom workflows.
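Because the JSON-line export stores one event per line, a downloaded file can be processed line by line. The sketch below assumes a downloaded file named events.jsonl and made-up field names, purely for illustration.

```python
# Minimal sketch for reading a JSON-line event stream export.
# "events.jsonl" and the fields accessed below are illustrative assumptions.
import json

with open("events.jsonl", encoding="utf-8") as stream:
    for line in stream:
        line = line.strip()
        if not line:
            continue              # skip blank lines, if any
        event = json.loads(line)  # each line is one complete JSON event
        print(event.get("type"), event.get("id"))
```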
The CSV format provides a customizable and organized view of the event stream, enabling you to choose specific data fields for export. This format makes the data easier to analyze in spreadsheet applications or reporting tools while retaining essential information needed for insights into advertising performance and activity.
Label - The desired label for the column.
Read Data From - The parameter from which you want to read data.
Default Value - A default value for the column; this field is optional (see the sketch below).
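The three column settings above map naturally onto a small export routine: Label becomes the CSV header, Read Data From is the event field to pull, and Default Value fills in when the field is missing. The sketch below is a stand-alone illustration of that idea under those assumptions, not the platform's own exporter.

```python
# Minimal sketch of building a CSV from events using Label / Read Data From / Default Value.
# The column definitions and events below are made-up examples.
import csv

columns = [
    {"label": "Event ID", "read_from": "id",             "default": ""},
    {"label": "Type",     "read_from": "type",           "default": "unknown"},
    {"label": "Account",  "read_from": "data.accountId", "default": "n/a"},
]

events = [
    {"id": "evt-1", "type": "ads.delivered", "data": {"accountId": "acc-42"}},
    {"id": "evt-2", "type": "ads.delivered", "data": {}},
]

def read_path(event: dict, path: str):
    """Walk a dotted path such as 'data.accountId' through nested dictionaries."""
    value = event
    for key in path.split("."):
        value = value.get(key) if isinstance(value, dict) else None
    return value

with open("events.csv", "w", newline="", encoding="utf-8") as handle:
    writer = csv.writer(handle)
    writer.writerow([c["label"] for c in columns])             # header row from Label
    for event in events:
        writer.writerow([
            read_path(event, c["read_from"]) or c["default"]   # Default Value when missing
            for c in columns
        ])
```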
Attention! Be careful when deleting event streams; this action cannot be undone, meaning your data is permanently removed.
These are all the metrics available in the Monitoring product for analyzing events. Additionally, when checking metrics, you can always check our to access additional information about a specific metric.
To collect the data, we must insert a filter by clicking on .
Once you have filled in all fields with the corresponding information for your event pipe, the test filter will show a check confirming that it is working properly . If it is not working properly due to missing information or a typo, a warning sign will be displayed .
To configure a target, click on and fill out the details according to the chosen target option:
Click on to save your target.
Once you have configured your webhook URL, use our Test Webhook tab to confirm the usability of your webhook tool. Select one of the sample event templates and click on .
If your webhook test is successful, your icon will be presented as and the tabs for your test will be populated.
After finishing all configuration and tests, click on to save your target.
Once everything is configured, click on and your event pipe will be listed.
To enable an event pipe, simply flip the toggle and your event pipe will be enabled.
To edit your event pipe, click on and an editing screen will pop up; make the necessary changes and then click on .
To archive your event pipe, click on and it will be sent to the archived list. In order to unarchive your event pipe, switch your view to archived events by flipping the toggle . You will then be presented with the list of archived events. Click on to unarchive an event pipe.
To delete an event pipe, click on and a confirmation will be required.
After clicking on , your event pipe will be deleted.
To create an event store, click on , and an event store creation screen will pop up.
After configuring your event store, click on the button to finish creating your event store.
To enable an event store, simply flip the toggle and your event store will be enabled.
To edit your event store, click on and an editing screen will pop up; make the necessary changes, and then click on .
To archive your event store, click on and your event store will be sent to the archived list. In order to unarchive your event store, switch your view to archived events by flipping the toggle . You will then be presented with the list of archived events. Click on to unarchive an event store.
To delete an event store, click on and a confirmation screen will pop up.
To confirm your deletion click on .
You are able to download your event streams by clicking on in the same row as the event stream you want to download. There are two file formats available:
Select this format and then click on to download your event stream.
It is possible to add or remove columns in your CSV file. To add a column, click on , then fill in the details:
To remove a column, select it in the list and then click on .
After finishing your settings, click on to download your CSV file.
You can select multiple event streams by checking the . After selecting more than one event stream, the bulk options will be enabled; you can download the selected event streams by clicking on and selecting one of the available file formats.
It is also possible to bulk delete your event streams: click on , then select Delete, and confirm the deletion when asked.
To delete an event stream, click on and a confirmation will be required.
After clicking on , your event data will be permanently deleted.
In our Monitoring section, you will be able to create event stores and pipes related to all of our products and your currently active campaigns. This will provide you with valuable data to study better solutions for your campaigns and improve your strategy.
It is possible to create numerous event stores to store data and set your own data retention settings, allowing you to adjust the retention according to your planning. Learn More about Event Stores.
Here you will decide which events you will collect data from, ranging from an active campaign to any event that happens within our platform. This provides ample data to help you keep track of any important product that you are focused on. Learn More about Event Pipes.
The Monitoring tab presents all bills related to Monitoring, which are divided into 4 sub-sections.
BMS prioritizes transparency by displaying every detail of your bill. Visit our Billing Home to understand how the bills are structured.
Every action taken on an event pipe will generate a request. For instance, creating an event pipe, listing your event pipes, and making changes to an event pipe will each generate a request. Additionally, each event pipe will create event logs per hour, which will then be charged based on a 720-hour period.
Example: On this bill, only the configured event pipe has been charged. By observing the number of hours, you can see that the configured event pipe has been active for a month and a few days, resulting in a total bill of $0.79. No other charges occurred since no requests exceeded the first 1,000 free requests.
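To make the arithmetic above easier to follow, here is a rough sketch of how such a charge could be estimated. The per-request and per-log-hour rates are hypothetical placeholders, not BMS's published pricing; only the 1,000 free requests and the hourly event logs come from the description above.

```python
# Rough sketch of estimating an event pipe charge; the rates below are hypothetical.
FREE_REQUESTS = 1_000        # the first 1,000 requests are free (per the example above)
RATE_PER_REQUEST = 0.0004    # hypothetical price per billable request
RATE_PER_LOG_HOUR = 0.001    # hypothetical price per hourly event log

def estimate_pipe_bill(requests: int, active_hours: int) -> float:
    billable_requests = max(0, requests - FREE_REQUESTS)
    return billable_requests * RATE_PER_REQUEST + active_hours * RATE_PER_LOG_HOUR

# A pipe active for roughly a month and a few days (~790 hourly logs),
# with fewer than 1,000 requests, is only charged for its event logs:
print(round(estimate_pipe_bill(requests=850, active_hours=790), 2))  # 0.79 with these rates
```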
On the Event Store page, all actions will count as a request. If you access the page, create an event store, check your available event streams, or download an event stream, each of these actions will count as a request. However, downloading an event stream will also incur charges, as will storing event streams. These charges will be based on the size of the event stream and the duration it is stored in BMS.
Example: On this bill, it is possible to see that most requests were not charged due to not reaching the free 1,000 requests. However, the event store requests incurred a $1.95 bill due to the high number of event streams generated within that event store. Having a configured event store also incurred charges because event streams are generated every hour, resulting in a $0.47 bill. Additionally, storing these generated event streams on the BMS server (if the webhook feature is not used) incurred a $0.04 bill, bringing the total bill to $2.46.
All metrics on any platform incur charges and are crucial for analyzing performance and making strategic decisions based on the collected data. BMS centralizes the charges for every metric on the platform under the Monitoring tab, which is responsible for receiving all events within the platform.
Metrics are charged per point recorded and for the bytes processed by BMS's servers to generate these metrics.
Example: On this bill, it is noted that nearly 17 million metric points were accumulated within a month. This total is generated by summing all the metrics on the user's account, resulting in a cost of $83.47. Additionally, the amount of data scanned to generate these metrics was nearly 119 thousand GB, resulting in a charge of $2.37, for a total bill of $85.84.
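As a hedged illustration of the "per point recorded plus bytes scanned" structure, the sketch below recomputes a total from those two components. Both unit rates are assumptions chosen only to roughly echo the figures in the example; they are not published prices.

```python
# Rough sketch of the metrics charge structure: points recorded + data scanned.
# Both rates are assumptions for illustration only.
RATE_PER_MILLION_POINTS = 4.91   # hypothetical $ per million metric points
RATE_PER_THOUSAND_GB = 0.02      # hypothetical $ per thousand GB scanned

def estimate_metrics_bill(points: float, gb_scanned: float) -> float:
    points_charge = points / 1_000_000 * RATE_PER_MILLION_POINTS
    scan_charge = gb_scanned / 1_000 * RATE_PER_THOUSAND_GB
    return round(points_charge + scan_charge, 2)

# Roughly the example above: ~17 million points and ~119 thousand GB scanned.
print(estimate_metrics_bill(points=17_000_000, gb_scanned=119_000))
```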
Some products have a real-time tab that displays events from your campaign as they happen. For example, when an ad is displayed, the user's location will be shown based on the ad exchange used, along with additional information. Each time the real-time tab is accessed, it will incur costs.
Example: In this case, the requests for recent real-time events were relatively low but exceeded the 100,000 free events mark, resulting in a charge of $0.14.
Learn more about how Event Stores Metrics work.
You can learn more about how metrics are handled by visiting the Metrics page.
Select an event store to check the metrics populated by its event pipes. Note that an event pipe must send its data to an event store for these metrics to be populated; if you send your events only to a webhook, the event store will not be populated.
This metric is populated when event streams are available for download. Based on the size of each event stream, it presents the sum of all your event streams' bytes.
Example: Once an event store is populated, it will present event streams related to the selected pipes, and you will be able to check the size and date of each one separately. The graph will grow as the streams' size increases. On this graph, it is possible to see that it reached nearly 2 MB in size within a week.
This metric presents the count of event streams in your event store; if the event store has many pipes related to it, the count will be high.
Example: When creating event pipes, you must select an event store or a webhook to send the data to. Once you select an event store, each pipe you create for the same event store will generate an event stream when receiving data. If you have too many event pipes pointing to the same event store, it is advisable to separate them in order to better organize your event streams. On this graph, you can see that the stream count reached a peak of nearly 200 streams.
This metric presents the total number of bytes uploaded to your Event Streams tab, making streams available for download.
Example: Each event stream is uploaded to your selected event store after a few minutes and becomes available for download as a .JSON file. The metric presents the sum of all uploaded event streams' bytes in your event store. On this graph, we can see that on specific dates the data was significantly reduced, due to each event stream's expiration date or to deletion.
This metric presents the total upload count for each event stream's data collected and made available for download; once streams expire or are deleted, the count will decrease.
Example: On this graph, it is possible to see that on some dates the count reached nearly zero due to having no active event pipes or because of a cleanup. It is important to check your upload count since it might impact how you are organizing your event streams. Having too many event streams to organize can be confusing, so a cleanup or optimization should be done from time to time.
This metric represents the sum of all downloaded bytes of your event streams and will be populated only when the event stream has been downloaded.
Example: Once you start downloading your event streams' data, you will be informed of the total number of bytes downloaded each day. This helps you keep track of how much data was downloaded. If the data volume is high, it indicates that many events were captured from the event stream. In this graph, you can see that only 2 KB of data was downloaded.
This metric represents the event streams downloaded, which will be populated every time you download an event stream.
Example: If you organize yourself to download the collected event pipe data every week or every two weeks, the metric will inform you so that you don't lose track of your established schedule. In this graph, you can see that the event streams were downloaded only after a week.
This metric is populated every time an event stream is deleted, showing you the sum of all bytes deleted, whether manually or by the retention date set upon event store creation.
Example: Once you start collecting data for your event store, you will set a retention date for the data. Every time the data reaches that date, your event stream will be deleted, or you can also delete it manually. In this graph, you can see that, at some point, nearly all event streams were deleted.
This metric represents the total count of deleted event streams; it is populated by both retention-based and manual deletion.
Example: On this graph, you can see that the delete count follows a pattern, but on some dates, a lower amount was deleted. This could be due to a change in the retention date or fewer event streams being enabled. Deleting data can help you keep it more organized.
These are all the metrics available in the Monitoring product for analyzing events. Additionally, when checking metrics, you can always check our to access additional information about a specific metric.