Salesforce recently announced their new Streaming API, which lets you associate a SOQL query with a topic that applications can subscribe to. Any application subscribed to that topic then receives updates as they happen, in real time.
Streaming Schreaming. So What?
Realtime data over the web sounds good, but how much of an improvement is it over polling? For starters, polling isn't very elegant: repeatedly asking an application whether anything has changed is hugely inefficient. Secondly, polling APIs means living with rate limits. These restrictions exist to reduce load on the API's infrastructure, and rate limiting on SaaS applications is often expressed as a number of calls per day, which means you usually cannot get close to real time without paying a lot more. Finally, the REST paradigm (the one used by 90% of public APIs) is not well suited to realtime; Streaming APIs work around its limitations to push data to the client in the same way AJAX techniques such as CometD and WebSockets do.
If you think back to the days when the term AJAX was coined, many people didn't see the true value of what the technology offered until the first Web 2.0 web sites started to get attention. To the less informed, AJAX seemed to just remove the need for the user to refresh the page in the browser. I believe streaming APIs are a similar step towards an event-driven Web.
The SalesForce Streaming API
Salesforce is not the first company to announce a streaming API; Twitter, Facebook and others started rolling out streaming APIs almost a year ago. What's different about Salesforce is that they are the shining light for most other SaaS companies (after all, they pretty much defined the category); they have started the ball rolling and others will follow suit. Getting at our application data in realtime isn't something we've had before, and it will change the dynamics of the applications we build.
Of course, there is a big win for Salesforce too. With over 50% of all their traffic going through their APIs, that's a lot of applications nagging their infrastructure for updated information, and the proportion of requests that return no new data could be as high as 60%. That's a lot of inefficient processing. The Salesforce platform can be greatly optimised if people start using the streaming API, since a single query result set could be filtered to serve hundreds or thousands of customers.
Streaming APIs and Mule
Mule is the first integration platform to support the Salesforce Streaming API. To test it out we created a simple demo application with Salesforce Chatter and Twilio, running on our cloud platform, Mule iON.
The demo is pretty simple: when you post to someone's Chatter wall, hashtags can be used to perform actions. In this case, if you post to the wall and include the #now hashtag, the recipient will be notified of your message through SMS.
To make this possible we created a Salesforce custom object (SMSNotification) that includes the user's mobile number to send SMS messages to. We created a trigger to update this custom object with FeedItems posted on the user's Chatter wall, and a topic called '/SMSNotificationStream' that selects everything in the SMSNotification object. So when a new post is made on a wall, the SMSNotification object is updated with the item, along with the mobile number used to send the SMS.
One of the nice things about this demo is that it is super easy; only a few lines of Mule XML are needed to make it work:
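The snippet below is a minimal sketch of what that flow could look like, assuming Salesforce and Twilio connector configs named 'salesforce' and 'twilio', and hypothetical custom-object fields MobileNumber__c and Message__c (namespace declarations omitted):

```xml
<!-- Sketch only: connector attributes and field names are illustrative -->
<sfdc:config name="salesforce" username="${sfdc.user}" password="${sfdc.password}"
             securityToken="${sfdc.securityToken}"/>
<twilio:config name="twilio" accountSid="${twilio.accountSid}" authToken="${twilio.authToken}"/>

<flow name="chatterToSmsFlow">
    <!-- Subscribe to the Streaming API topic; the flow fires for every new SMSNotification record -->
    <sfdc:subscribe-topic topic="/SMSNotificationStream" config-ref="salesforce"/>

    <!-- Only react to posts that contain the #now hashtag -->
    <expression-filter evaluator="groovy" expression="payload.Message__c.contains('#now')"/>

    <!-- Send the post text to the recipient's mobile number via Twilio -->
    <twilio:send-sms-message config-ref="twilio"
                             from="${twilio.fromNumber}"
                             to="#[groovy:payload.MobileNumber__c]"
                             body="#[groovy:payload.Message__c]"/>
</flow>
```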
- You'll see that the flow starts with 'sfdc:subscribe-topic'; this initiates a streaming connection to Salesforce and triggers the flow when new data arrives.
- The filter is used to detect hashtags. Here we only look for #now, but we could look for different ones and react to each separately.
- The Twilio ‘send-sms-message‘ command extracts the number and message to SMS from the payload received from Salesforce
- This configuration can be dropped in to Mule iON and happily sit there listening and reacting to Chatter; like a message switchboard in the cloud
- This all happens in realtime!
Let me at it!
Right now the Salesforce Streaming API is in beta, and the streaming connector for Mule will not be officially released until Mule 3.2 in a few weeks. If you want to try it out before then, please let me know (tweet or email ross at mulesoft).
Follow: @rossmason, @mulejockey
Source: Mule Blog
With the rapidly increasing adoption of SaaS, integration Platform as a Service (iPaaS) has become the preferred way to connect SaaS applications. With the explosion of Open APIs on the Web, connecting APIs together is becoming the norm for application development. However, typical application containers, and even application PaaS offerings, don't help in this new era where applications compose APIs together from many different sources.
In the iPaaS world, composing APIs together is the modus operandi. An application built on an iPaaS is focused on connecting two or more systems together via APIs in order to synchronize data between them. However, we've taken this new breed of applications much further with the concept of Integration Apps.
Integration Apps exist in a world where everything needs to connect. Open APIs define thousands of new endpoints for exchanging data and leveraging new functionality, and Integration Apps are optimized for working with many different data sources, focusing on composition rather than just coding.
To explain Integration Apps, let's take a look at the anatomy of a traditional Web application. This will be familiar to any developer; it's the traditional 3-tier application model that defines how most applications to date have been built. There are a few problems with this architecture in today's API-centric world:
- The database has traditionally been the source of truth, but now applications work with many data sources. Increasingly, apps need to read from and write to APIs from different 3rd-party providers.
- The App Server is just an HTTP container. It doesn't provide much in the way of capabilities other than hosting code and mapping HTTP requests to an application.
- Custom logic is a big bucket where the application logic resides. This is where data access and application code are hosted. Developers often use open source frameworks such as Ruby on Rails and the Spring Framework to make creating applications easier, but these frameworks don't provide much for connecting to lots of data sources or working with different data formats.
- Traditional web apps are user-focused; however, new applications need to cater for machines too. Increasingly, people think about building applications API-first: that is, creating an API for the application that can be consumed by JavaScript, native mobile applications and other applications.
Even when running on PaaS, applications don't change. This is deliberate, since PaaS has focused on getting existing applications into the cloud. However, it does nothing to address the explosion of Open APIs.
Introducing Integration Apps
In contrast, Integration Apps embrace the need to connect to APIs, work with multiple data sources and data formats, and mediate between different applications. The iPaaS platform provides completely new types of services for dealing with interactions with other remote systems, including monitoring, tracking, governance and mediation. This is needed because Integration Apps take a message-driven approach to connecting APIs: rather than making only synchronous calls in code, interactions are defined through messages being passed between components. This will be familiar to developers who know newer platforms such as Node.js or Scala, where messages are passed to listeners or between Actors.
With Integration Apps there are more capabilities built in, so the developer doesn't have to do the heavy lifting.
Connectivity is focused on working with lots of different APIs; this layer manages the security, session management and monitoring of connections.
The 'Custom Logic' is joined by new capabilities for composing different APIs together using Orchestration. Data Mapping is needed since the data exchanged via APIs comes in different formats, so being able to work with XML, JSON, RSS, ATOM, CSV and legacy formats is really important. There is also more focus on Error Handling, since interactions between different systems need to be clear and visible.
iPaaS offers the same services as PaaS, such as database and storage. But because these applications are message-driven, there is a whole new set of platform services that help you track information between systems, set up alerts, and handle errors with message replay.
Integration Apps don't only connect with Open APIs on the Web; connecting with on-premise applications and data sources is often needed too, so the notion of a data gateway is important.
One element missing from the above picture is the User Interface. This is because, increasingly, applications are being built to serve machines rather than people, making a UI optional. Synchronizing data between a few applications doesn't need human interaction; nor do many automated business processes. However, Integration Apps natively support publishing ReST APIs so that other applications and mobile devices can interact with an Integration App.
Integration Apps in the wild
MuleSoft already offers public Integration Apps that synchronise data between SaaS applications, or between SaaS and on-premise applications like SAP. The most well known Integration App out in the wild is dataloader.io, an Integration App that allows users to upload data into Salesforce, a very common task. The dataloader.io iApp is the only cloud-based data loading solution for Salesforce and has become very popular, taking the number 1 spot on the Salesforce AppExchange. This Integration App looks very much like the diagram above: it has a JavaScript UI that talks to the app via its ReST API, and the app uses a mix of Orchestration, Data Mapping and custom logic to let users create jobs for loading data into Salesforce.
Without iPaaS and Integration App capabilities this application would have taken months to build from the ground up, with all the monitoring, management, error handling and connectivity; instead we built it in 4 weeks. And now it serves thousands of Salesforce users per day.
Faster, more Productive
The combination of iPaaS and Integration Apps is very powerful, enabling a new type of application that responds to changes in real time. Pushing more services and functionality into the platform drives a configuration-over-coding approach, which means developers can focus on composing their application rather than coding everything from the ground up. If necessary the developer can insert custom logic into their app, but for most scenarios it isn't necessary, with orchestration and the data mapper providing the tools to work with APIs and different data formats.
Developers who take advantage of these new capabilities will create richer, more engaging applications. This new breed of applications will further fuel the Open API explosion, adding new APIs that can be consumed by people and machines. Open APIs power a world where everything needs to connect; it's time for a platform that embraces this.
Follow: @CloudHub, @MuleSoft, @rossmason
Source: Mulesoft Blog
Tom Stroobants on Tuesday, April 17, 2012
Mule School: Integration with Social Media: Part I – Twitter
Today's post will focus on connecting to Twitter and sending a tweet (if you don't know what Twitter is, read this). Subsequent tutorials will cover further social media integrations.
Mule Server and Studio versions
For this integration, I am using the latest version of Mule ESB Community Edition with Mule Studio (1.0.0). This sample can also be run in standalone Mule ESB Community Edition and Mule ESB Enterprise Editions.
Mule Studio comes with a built-in Twitter connector that we can use straight away. Let's build a new Twitter flow that looks like the one below. We will create an HTTP inbound endpoint that forwards requests to the Twitter connector. Finally, the Twitter connector returns a twitter4j.StatusJSONImpl object, which will be transformed using an expression-transformer to display the response object's string representation.

Let’s build the sample now.
- Create a new Mule flow and name it “twitter”.
- Drag and drop a new HTTP inbound endpoint onto "twitterFlow1". Double-click the HTTP icon to bring up the properties dialog and specify "addtweet" for the Path field.
- Click on the "Global Elements" tab and click Create to bring up the Global Type dialog box. Select "Twitter" from the "Cloud Connectors" section, leave the default values and click OK. We need to configure a Twitter account to generate the necessary security tokens; I will explain this process in the next section.
- Drag and drop the Twitter connector next to the HTTP inbound endpoint. Double-click the Twitter icon to bring up the properties dialog. Select the Twitter connector we created in the previous step for the "Config Reference" field, select "Update status" for the Operation field and, finally, specify "#[header:INBOUND:mymessage]" as the Status. This expression extracts the "mymessage" parameter value from the HTTP request.
- Finally, drag and drop an "expression transformer" next to the Twitter connector. Double-click the Expression icon to bring up the properties dialog and specify the evaluator as "groovy" and the expression as "payload.toString()". More on expression transformers can be found in the Mule 3 documentation.
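Put together, the flow and its global Twitter configuration look roughly like the sketch below; it simply mirrors the steps above, the four OAuth values are placeholders you generate in the next section, and namespace declarations are omitted:

```xml
<!-- Sketch: the global Twitter configuration holds the four keys generated on dev.twitter.com -->
<twitter:config name="Twitter"
                consumerKey="YOUR_CONSUMER_KEY"
                consumerSecret="YOUR_CONSUMER_SECRET"
                oauthToken="YOUR_ACCESS_TOKEN"
                oauthTokenSecret="YOUR_ACCESS_TOKEN_SECRET"/>

<flow name="twitterFlow1">
    <!-- Listens on http://localhost:8081/addtweet?mymessage=... -->
    <http:inbound-endpoint exchange-pattern="request-response"
                           host="localhost" port="8081" path="addtweet"/>
    <!-- "Update status" operation; the tweet text comes from the mymessage query parameter -->
    <twitter:update-status config-ref="Twitter" status="#[header:INBOUND:mymessage]"/>
    <!-- Return the string representation of the twitter4j status object -->
    <expression-transformer evaluator="groovy" expression="payload.toString()"/>
</flow>
```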
Anyway, what do all these attributes mean?
The “consumerKey”, “consumerSecret”, “oauthToken” and “oauthTokenSecret” are in fact keys that are generated by the Twitter application. (More on that in a minute.)
Configure Twitter
Before you are able to start using the Twitter integration, you will have to do some configuration in your Twitter account.
Go to the following url: https://dev.twitter.com and sign in with your Twitter username and password.
First, you should add an application:

Fill in all the fields in the screen and agree to the “Terms and Conditions.”

Once your application has been generated, you can choose from a tab bar to configure your application in more detail:

You will see that the consumer key and consumer secret have already been generated, but the access level is Read-only. If you want to read more about the Twitter permission model, you can click on the link.

To authenticate your application with your Twitter account you will have to generate authentication keys. This is not done by default:

Click the button to create your access tokens and the following screen will appear:

By default the access level is Read-only. If you need access to direct messages you will have to update your Access level. This can be done in the following ways:
Consumer keys
Go to the Settings tab and adjust the access level:

OAuth keys:
If you also want to update the access level of your OAuth keys, recreate your access token after changing the access level in the Settings tab.

Running the application
Right-click on twitter.mflow and select Run As > Mule Application. Once the application has successfully started, you can test it with the following URL: http://localhost:8081/addtweet?mymessage=hello. Check your Twitter account to see a new tweet with the message "hello".
A successful Twitter update returns the twitter4j status object, whose string representation (produced by the expression transformer) is displayed in the HTTP response.
And that’s it! Have fun!
This is a guest post from Mule community member Tom Stroobants. Thank you Tom! (we’ll be sending you a cool T-shirt). If anyone else in the Mule community would like to write a guest post, please email us.
One of the reasons why we are partnering with THINKstrategies is to help companies see how an integration-platform-as-a-service (iPaaS) can accelerate their Big Data and cloud analytics projects.
The integration challenges around Big Data and cloud analytics tend to be twofold. First, it’s important to have your data in a central place, and second, it’s extremely important to collect and analyze that data in real time.
Ask yourself, how helpful would it be to have analytics from only 50% of your data sources? Or how about 1-2 month old analytics about your business? By the time you collected information from all the data sources and crunched the numbers, your market opportunity may have passed you by. Today’s data sources are more distributed, and as more companies look to SaaS offerings like Workday, Box, and Salesforce.com for their core business applications, their big data and integration challenges are only going to get bigger.
We are participating in a panel discussion at the conference to explore this topic and more! We hope that you’ll join us for the discussion and stop by to see us at the expo hall. Here are the details:

About the Conference:
The Cloud Analytics Summit
April 25, 2012 | Mountain View, CA
Computer History Museum
Website: http://cloudanalyticssummit.com/
Working with Databases (JDBC) in Mule Studio
In this blog post, I'll give you some background information about JDBC, explain what Mule ESB and Studio do with JDBC, and demonstrate how you can use it in a simple example.
Optionally, you can go to the Queries tab now and create the queries or SQL statements that you want. If you don't do this now, you will have to do it when configuring an endpoint.
A little reference for JDBC:
JDBC, which stands for Java Database Connectivity, is basically an API that enables users to execute operations over a data source using the Java programming language. This API allows you to connect to almost any data source system, from relational databases to spreadsheets and flat files, and, using the proper SQL syntax, you can perform queries, updates and deletes, or even execute stored procedures.
What Mule ESB and Mule Studio do with JDBC
Now let's see how this is architected in Mule ESB. What Mule ESB does is make this Java code layer transparent to you: simply importing a jar file with the driver for a specific data source (MySQL, Oracle, etc.) and writing some easy XML code will enable you to connect to a data source and manipulate the data in it. Studio comes with a friendly user interface, which makes Mule XML code very easy to create and edit. The image below gives you a better idea of how all this works:
At the very end of the line is your data source, which can be fed by any other application. Next you have the JDBC driver. As we mentioned earlier, this is the Java API implementation provided by the vendor of the data source, and it allows Mule to connect to the data source and manipulate the data in it. What comes next is our Mule ESB instance, which is the service that executes the Mule XML code. And finally we have Mule Studio and you. Studio gives you the framework to easily create the XML code you need and allows you to test it by executing the code in an embedded Mule ESB instance. So by using Studio, the other layers are transparent to you.
My kingdom for a Driver!
Before configuring a JDBC connection, the first thing we need is the driver. If you want to keep your kingdom, you should first go to the vendor's website and look for a JDBC driver file, which should be in jar format. Keep in mind that some vendors, like Oracle, may require a license to use the driver. NOTE: on www.jarvana.com you can look for the driver class you need and download the jar file from there. In the example explained below we are going to work with a MySQL database. You can download the driver file from here (registration required) or look for the connector class in jarvana.
Putting hands to work
Open a new Mule Project in Studio, and then follow these steps to get your flow working: a. import the driver, b. create a datasource, c. create a connector that uses our datasource, and finally d. create a simple flow that uses our connector.
a. Import the Driver
Once you have the jar file, the next steps are very simple:
- In the Package Explorer, right-click the project folder (in this case "jdbcprj").
- Look in the menu for Build Path > Add External Archives…
- Look for the jar file in your hard drive and click Open.
b. Creating a Datasource
Mule and Studio come with predefined configuration elements for the most common datasources: Derby, MySQL, Oracle and PostgreSQL. If you want to use another datasource, you can do so by creating a bean object with the configuration and using that bean as the datasource. Now let's create a MySQL datasource for our connector:
- Go to the Global Elements tab and click on the Create button, which will display a new window.
- Look for Data Sources > MySQL Data Source and click the OK button.
- In the Data Source configuration window, only three things are needed to make this work: the database name in the URL, the User and the Password. Enter those attributes according to your database configuration and click OK.
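Behind the scenes Studio generates a global datasource element. Roughly, and with placeholder connection values (attribute names can vary slightly between Mule versions), it looks like this:

```xml
<!-- Sketch of the generated MySQL datasource global element; url, user and password are placeholders -->
<jdbc:mysql-data-source name="MySQL_Data_Source"
                        url="jdbc:mysql://localhost:3306/mydb"
                        user="myuser" password="mypassword"/>
```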

c. Create the Connector
Now that we have the datasource with its driver, we need a connector.
- From the Global Elements tab, click on Create and look for Connector > Database (JDBC). Then click OK.
- The only thing that we need to do here is tell the connector which datasource to use. To do this, click on the 'Database Specific' drop-down list and select the datasource created in the previous step. Then click OK.

d. Creating a flow
Now we have half of the work done. To use our datasource in a flow, we need an inbound endpoint or an outbound endpoint, depending on what we want to do: use a JDBC inbound endpoint if you want to use information from a database to feed your flow and do some processing, or an outbound endpoint if you want to write the information processed in your flow to a database. In either case you need to do this:
- In the Studio Message Flow view, add a JDBC endpoint (either inbound or outbound) to the flow, and open the configuration window by double-clicking on the endpoint. Note: to add the endpoint, just look for it in the palette and drag and drop it onto the canvas. If you drop it on the canvas outside of any flow, a flow scope will be created and your endpoint will be an inbound endpoint; if you drop it into a flow after any other element, it will be an outbound endpoint. Studio performs these conversions automatically, as flows should always start with inbound endpoints.
- Go to the References tab and, in the connector drop-down list, look for the JDBC connector created in step c. We are telling the endpoint how to connect to the data source by specifying a reference to a connector. The connector configuration is global, so it can be reused in as many endpoints as you want.
- Go to the General tab and select the Query Key you want to use in this endpoint. A JDBC endpoint can execute one SQL statement. If you have not created the query in the connector, you can do it now by going to the Queries tab.
* Queries tab and a new query:
* Query selected in the Query key drop-down list:

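The resulting configuration is only a handful of XML elements. The sketch below shows how the pieces fit together, reusing the MySQL datasource defined earlier and an illustrative 'selectUsers' query against a hypothetical users table (namespace declarations omitted):

```xml
<!-- Sketch: connector referencing the datasource, plus a flow that polls the database -->
<jdbc:connector name="Database" dataSource-ref="MySQL_Data_Source"
                validateConnections="true" queryTimeout="-1" pollingFrequency="10000">
    <!-- The query key referenced from the endpoint -->
    <jdbc:query key="selectUsers" value="SELECT * FROM users"/>
</jdbc:connector>

<flow name="jdbcFlow1">
    <!-- Inbound endpoint: runs the selectUsers query on each poll and feeds the rows into the flow -->
    <jdbc:inbound-endpoint queryKey="selectUsers" connector-ref="Database"/>
    <!-- Do something with each row; here we just log its string representation -->
    <logger level="INFO" message="#[groovy:payload.toString()]"/>
</flow>
```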
Salesforce Bulk API Integration using Mule ESB
Salesforce CRM is widely used in organizations to manage their customer interactions. However, with the cloud delivery model it has become difficult and expensive for organizations to custom-code the integration of Salesforce with their existing on-premise systems.
Many organizations need this integration in a cost- and time-effective way to automate their business processes. As a solution to this problem, WHISHWORKS has a way to integrate a company's existing systems with Salesforce using the modern, lightweight and low-cost Mule Enterprise Service Bus.
WHISHWORKS and Salesforce Bulk API
WHISHWORKS has extensive experience using the Mule ESB Anypoint Salesforce Connector to connect directly with the Salesforce APIs. This connector gives users access to full Salesforce functionality with seamless Salesforce integration.
In a business scenario where there were huge volumes of data to be migrated to Salesforce from the company's multiple existing systems, WHISHWORKS implemented an effective way of integrating with the Salesforce Bulk API.
Mule ESB flows have been designed in a way that can be reused for both initial and operational loads. Transformation of data has also been provided in the process of data import to give a standardized and consolidated form of data in Salesforce.
How did we tune the Salesforce Bulk API?
Bearing in mind the different constraints the Salesforce Bulk API has, WHISHWORKS tuned the batches uploaded to Salesforce in an effective manner, enabling seamless business automation between Salesforce and the existing database systems. Here is how:
- Threading Profile Settings: Salesforce allows a maximum of 25 concurrent threads at a time. To restrict concurrent calls to no more than 25, threading profiles have been created at the flow and VM endpoint level where the Salesforce calls reside (a sketch of such a profile appears after this list).
- Salesforce Batch Tuning: Each Salesforce batch being created is tuned so that the data size of the batch does not exceed 10MB. Tuning parameters have been configured to change the number of records each batch holds depending on the size of the entity.
- Time Delay between each Salesforce Call: Loading huge volumes of data to Salesforce in concurrently running batches can cause Salesforce to take a longer time to process the batches. To avoid this, a time delay has been introduced between concurrent calls so that Salesforce is not overloaded.
- Parallel Garbage Collection: To use JVM memory efficiently while importing the data, parallel garbage collection has been used to clean up Java objects that are no longer strongly referenced.
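To illustrate the first point, the sketch below shows one way a 25-thread cap can be expressed in Mule 3 XML, using a threading profile on the VM connector that feeds the Salesforce calls. It is an illustration rather than the project's actual configuration, and the connector name is made up:

```xml
<!-- Sketch: cap the VM queue that feeds the Salesforce Bulk API calls at 25 concurrent
     receiver threads; the connector name is illustrative -->
<vm:connector name="salesforceLoadConnector">
    <receiver-threading-profile maxThreadsActive="25" poolExhaustedAction="WAIT"/>
</vm:connector>
```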
All this was done on Mule ESB!
Benefits to the Customer
The Salesforce Integration with the organization’s multiple systems has provided the following benefits to the customer:
- This has enabled the customer to streamline and fully automate their business processes.
- Scalability: Integration through Mule ESB has enabled adaptation to any new SOA infrastructure that needs to be defined as part of the company’s changing infrastructure.
- Speed of Integration: With an underlying platform that contains a single development environment and a reliable multi-tenant architecture, integration to Salesforce has been quickly and efficiently built.
- This has provided them with an ability to integrate more systems and to aggregate data for a consistent and accurate overview of business.
- Significant cost savings by using low cost Mule ESB Enterprise.
Author: Harika Guniganti is a Masters of Computer Science graduate with 4 years of experience as a Technical Specialist with WHISHWORKS. An experienced hand at integration and Mule ESB, Harika loves cooking, crafts and travelling.
API Analytics
Viewing API Analytics
Access the Analytics dashboard for your Anypoint Platform for APIs organization to get insight into how your APIs are being used and how they are performing.
Assumptions
In order to access the Analytics Dashboards, you must be a member of the Organization Administrator role for your organization. Users who are API Creators or API Version Owners can access the Top 5 APIs chart on the API Administration page, but cannot access the Analytics Dashboards.
Accessing the Top 5 APIs Chart
Organization Administrators, API Creators, and API Version Owners can view a snapshot of the latest activity for your organization's five most-called APIs at the top of the API Administration page. Note that if you don't have any APIs yet in your organization, this chart will not appear.
Once you have data to display, the Anypoint Platform rolls together all the API calls made to all versions of an API and combines them into a single line of data. Each of your top five APIs is represented by a different color.
On this chart, you can:
- Hover over the items in the legend to highlight a single API.
- Hover over a peak to view a tooltip with details about the total number of API requests received for that collection time.
- Change the zoom by clicking the links underneath the chart title. You can view data for the last hour, three hours, day, week, or month.
Accessing the Analytics Dashboards
As an Organization Administrator, you have access to the Analytics Dashboards for your organization. Go to anypoint.mulesoft.com/analytics to access your Analytics Dashboards. You can also navigate to the overview dashboard by clicking the Analytics> link above the Top 5 APIs chart on your API Administration page.
If you don't see the Analytics> link, you are not a member of the Organization Administrator role.
Navigating the Overview Dashboard
When you access the Analytics for your organization, by default you land on your Overview Dashboard. This dashboard displays four standard charts:
- Requests by Date: Line chart that shows the number of requests for all APIs in your organization.
- Requests by Location: Map chart that shows the number of requests for each country of origin.
- Requests by Application: Bar chart that shows the number of requests from each of the top five registered applications.
- Requests by Platform: Ring chart that shows the number of requests broken down by platform.

All four of these charts display, by default, the data for all APIs in your organization for the past day. However, you can use the filters at the top left of the page to change the date range or filter to particular APIs. Note that the time ranges displayed automatically reflect your local time zone.

All of the charts on the Overview Dashboard are cross-filtered. This means that if you filter the data on any one of these charts, the same filter is automatically applied to the other charts on the page. Clicking on an individual application in the bar chart, for example, displays all of the requests from that application and the locations for those requests on the map. Here's how to filter data on individual charts.
Chart | How to filter data | How to clear your filter
---|---|---|
Requests by Date | Click and drag an area of the chart to filter to just that time period. Once you have the slider filter applied, you can drag the ends to the left or right to adjust them as needed. | Click outside the area of the filtered portion. |
Requests by Location | Click one or more countries to filter to just those results. Hover over a country for a tooltip displaying the name of the country and the total number of API requests received from that country for the selected time period. | Click the country or countries again to reset the map. |
Requests by Application | Click one or more application bars to filter to just those results. Hover over an application's data bar for a tooltip displaying the name of the application and the total number of requests from that application for that time period. | Click the application(s) again to reset the chart. |
Requests by Platform | Click one or more segments to filter to just those results. Hover over a segment for a tooltip displaying the name of the application and the total number of requests from that application for that time period. | Click the segment(s) again to reset the chart. |

Note that even if you have filtered data on one of the charts to show only selected data, the export icon triggers an export of a .csv file of the full data for that chart, filtered by whatever date range and API selection you have made using the filters in the upper left of the page.
Creating Custom Charts
The Anypoint Platform for APIs allows you to create a wide variety of custom charts to display exactly the data that you wish to track for your APIs. You can display these charts on your Custom Dashboard.
For example, you can create custom charts that show:
- Hourly transactions per second between first day of the month and today, filtered by client id, API version, or SLA tier.
- Per minute latency average in the last 24 hours, filtered by API or grouped by client geolocation.

- On the Charts page, click New to create a new custom chart. You are directed to the Create Chart screen.
- Give your chart a Title, and, optionally, a Description.
- Click one of the four thumbnails on the left of your preview to select the chart type.
Available chart types:
- Line chart
- Bar chart
- Ring chart
- Map chart
- Use the drop down options to select a data source, a metric, an aggregation (if relevant), and a data interval (for line charts) or grouping dimension (for other chart types).
Available data sources:
- All APIs in your organization or a single API version
Available metrics:
- Requests
- Response size
- Request size
- Response time
Available data intervals (for line charts):
- Minutes
- Hours
- Days
Available grouping dimensions:
- API Name
- SLA Tier
- API Version
- Hardware Platform
- OS Family
- OS Major Version
- OS Minor Version
- OS Version
- Browser
- User Agent Version
- Application
- Client IP
- City
- Continent
- Country
- Postal Code
- Timezone
- Resource Path
- Request Timestamp
- Response Timestamp
- Status Code
- User Agent Type
- Verb
- Click Save Chart when finished.
See the next section for information about how to add charts to your Custom Dashboard.
Creating a Custom Dashboard
Once you have created some custom charts, you can display them side by side on a custom dashboard that is unique to you. Any other members of the Organization Administrator role do not share your custom charts or custom dashboard; these views are unique to each user.
To access your custom dashboard, click the menu icon in the upper right of the page and select Custom Dashboard.

- The first time you open your custom dashboard, it will be blank. Click Edit Dashboard in the upper right.
- Drag and drop charts from the drawer on the left of the screen onto your dashboard, rearranging them as needed into the order that you want.
- If you don't have any charts yet, click Create Chart to create a custom chart.
- After you add a chart to your dashboard, you have the option to open it for editing or click the X to remove it from your dashboard.
- Once you are satisfied with your custom dashboard, click Save at the top next to the name. You are redirected to a view of your saved custom dashboard.

When you view your custom dashboard, note that you have a date range picker in the upper left corner that allows you to adjust the time period for all the charts on your dashboard.
Exporting Analytics Data
You can export your analytics data from the charts displayed on your Overview Dashboard or your Custom Dashboard. On either dashboard, click the export icon to download a .csv file with the data for that chart.
Note that the data that you download reflects the selection of the filtering options offered in the upper left corner of your dashboard. However, if you are exporting chart data from the Overview Dashboard and you have selected one or more subsections of a chart, the export files do not reflect that selection – instead any export always contains the full data for that chart without considering the chart-level filters that you may have applied.