Wednesday, 18 June 2014

Building APIs around Microsoft Azure SQL Databases

In this post we are going to discuss a few emerging trends within computing, including cloud-based database platforms, APIs and Integration Platform as a Service (iPaaS) environments.  More specifically, we are going to discuss how to:
  • Connect to a Microsoft Azure SQL Database using Mule ESB (for the remainder of this post I will call it Azure SQL)
  • Expose a simple API around an Azure SQL Database
  • Demonstrate the symmetry between Mule ESB On-Premise and its iPaaS equivalent, CloudHub
For those not familiar with Azure SQL,
it is a fully managed relational database service in Microsoft’s Azure
cloud.  Since this is a managed service, we are not concerned with the
underlying database infrastructure.  Microsoft has abstracted all of
that for us and we are only concerned with managing our data within our
SQL Instance.  For more information on Azure SQL please refer to the
following
link.


Prerequisites

In order to complete all of the steps in this blog post we will need a Microsoft Azure account, a MuleSoft account and MuleSoft’s AnyPoint Studio – Early Access platform.  A free Microsoft trial account can be obtained here and a free CloudHub account can be found here.
To enable database connectivity between MuleSoft’s ESB platform and Azure SQL we will need to download the Microsoft JDBC Driver 4.0 for SQL Server, which is available here.


Provisioning a SQL Instance


  1. From the Microsoft Azure portal we are going to provision a new Azure SQL Database by clicking on the + New label.




  2. Click on Data Services – SQL Database – Quick Create.

  3. Provide a DATABASE NAME of muleAPI, select an existing SERVER and click CREATE SQL DATABASE.  Note: if you do not have an existing SQL Server, you can create a new one by selecting New SQL database server from the SERVER drop-down list.

  4. After about 30 seconds we will discover that our new Azure SQL instance has been provisioned.







  5. Click on the muleapi label to bring up the home page for our database.







  6. Click on the View SQL Database connection strings link.







  7. Note the JDBC connection information. We will need this later in the Mule ESB portion of this blog post.

  8. Next, we want to create a table where we can store our Product list.  In order to do so we need to click on the Manage icon.

  9. We will be prompted to add the IP Address of the computer we are using.  By selecting Yes we will be able to manage our Database from this particular IP Address.  This is a security measure that Microsoft puts in place to prevent unauthorized access.

  10. A new window will open where we need to provide the credentials that we created when provisioning a new Azure SQL Server.

  11. We will be presented with a Summary of our current database.  In order to create a new table we need to click on the Design label.

  12. Click on the New Table label and then provide a Table Name of Products. We then need to create columns for ProductName, Manufacturer, Quantity, and Price.  Once we have finished adding the columns, click the Save icon to commit the changes.

  13. We now want to add some data to our Products table and can do so by clicking on the Data label.  Next we can add rows by clicking on the Add row label.  Populate each column and then click on the Save icon once all of your data has been populated.  (A scripted alternative is sketched just below.)
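If you would rather script the table setup than use the portal designer, a plain JDBC sketch along the following lines does the same job. This is an illustration only: the column types, the sample row and the connection string placeholders are assumptions, so substitute the JDBC connection string (with your password) from the Azure portal.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateProductsTable {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; use the JDBC connection string shown in the Azure portal.
        String url = "jdbc:sqlserver://<yourserver>.database.windows.net:1433;"
                + "database=muleAPI;user=<user>@<yourserver>;password=<password>;encrypt=true;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Column names follow the walkthrough; the types are assumed for illustration.
            stmt.executeUpdate("CREATE TABLE Products ("
                    + "ID INT IDENTITY PRIMARY KEY, "
                    + "ProductName NVARCHAR(100), "
                    + "Manufacturer NVARCHAR(100), "
                    + "Quantity INT, "
                    + "Price DECIMAL(10,2))");

            // One hypothetical row so the API has something to return.
            stmt.executeUpdate("INSERT INTO Products (ProductName, Manufacturer, Quantity, Price) "
                    + "VALUES ('Widget', 'Contoso', 10, 9.99)");
        }
    }
}
```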


This concludes the Azure SQL portion of the walkthrough.  We will now focus on building our Mule Application.



Building the Mule Application


  1. The first thing we need to do is to create a new Mule Project and we can do so by clicking on File – New – Mule Project.

  2. For the purpose of this blog post we will call our project SQLAzureAPI.  Since we will take advantage of some of the new Mule 3.5 features, select the Mule Server 3.5 EE Early Access runtime and then click the Finish button to continue.

  3. A Mule Flow will automatically be added to our solution.  Since we want to expose an API, we will select an HTTP Endpoint from our palette.

  4. Next, drag this HTTP Endpoint onto our Mule Flow.

  5. Click on the HTTP Endpoint and set the Host to localhost, Port to 8081 and Path to Products. This will allow Mule ESB to listen for API requests at the following location: http://localhost:8081/Products
 

  6. Next, search for a Database Connector within our palette.  Notice there are two versions; we want the version that is not deprecated.

  7. Drag this Database Connector onto our Mule Flow next to our HTTP Endpoint.

  8. As mentioned in the Prerequisites section of this blog post, the Microsoft JDBC Driver is required for this solution to work.  If you haven’t downloaded it, please do so now.  We need to add a reference to this driver from our project.  We can do so by right-clicking on our project and then selecting Properties.

  9. Our Properties form will now load.  Click on Java Build Path and then click on the Libraries tab. Next, click on the Add External JARs button.  Select the sqljdbc4.jar file from the location where you downloaded the JDBC driver and then click the OK button to continue.

  10. We are now ready to configure our Database connection.  Click on our Database Connector so that we can specify our connection string, then click on the green plus (+) sign to create a new Connector configuration.

  11. When prompted, select Generic Database Configuration and click the OK button.

  12. In the portion of this blog post where we provisioned our Azure SQL Database, it was suggested to make a note of the JDBC connection string.  This is the point in the walkthrough where we need that information.  Paste this value into the URL text box.  Please note that you will need to update the connection string with your actual password, as it is not exposed in the Microsoft Azure portal.  For the Driver Class Name, find com.microsoft.sqlserver.jdbc.SQLServerDriver by clicking on the … button.  Once these two values have been set, click on the Test Connection button to ensure you can connect to Azure SQL successfully.  Once the connection has been verified, click on the OK button.
Note: For this blog post, the connection
string was embedded directly within our Mule Flow configuration.  In
certain scenarios this obviously is not a good idea.  To learn about how
these values can be set within a configuration file, please visit the
following link or see the section below when we deploy our application to CloudHub.
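If you would like to sanity-check the connection string outside of AnyPoint Studio, a minimal JDBC check along these lines mirrors what the Test Connection button does. The URL below is a placeholder; paste in the connection string from the Azure portal with your actual password.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class TestAzureSqlConnection {
    public static void main(String[] args) throws Exception {
        // Optional with a JDBC 4 driver, but it confirms sqljdbc4.jar is on the classpath.
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");

        // Placeholder URL; use the JDBC connection string from the Azure portal.
        String url = "jdbc:sqlserver://<yourserver>.database.windows.net:1433;"
                + "database=muleAPI;user=<user>@<yourserver>;password=<password>;encrypt=true;";

        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected successfully: " + !conn.isClosed());
        }
    }
}
```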

  13. With our connection string verified, we need to specify a query that we want to execute against our Azure SQL Database.  The query that we want to run will retrieve all of the products from our Products table.  We want to add the following query to the Parameterized query text box: Select ID, ProductName, Manufacturer, Quantity, Price from Products
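As an aside, the query above happens to have no parameters. The sketch below shows, in plain JDBC, how a placeholder would be bound if we did filter the results; the WHERE clause and the price threshold are hypothetical additions, not part of the walkthrough, and the connection string is again a placeholder.

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class QueryProducts {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; use the JDBC connection string from the Azure portal.
        String url = "jdbc:sqlserver://<yourserver>.database.windows.net:1433;"
                + "database=muleAPI;user=<user>@<yourserver>;password=<password>;encrypt=true;";

        // The WHERE clause is a hypothetical addition to illustrate parameter binding.
        String sql = "SELECT ID, ProductName, Manufacturer, Quantity, Price FROM Products WHERE Price < ?";

        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setBigDecimal(1, new BigDecimal("100.00"));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("ProductName") + " - " + rs.getBigDecimal("Price"));
                }
            }
        }
    }
}
```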

  14. For the purpose of this API, we want to expose our response data as JSON.  In the Mule ESB platform this is as simple as finding an Object to JSON transformer from our Palette.

  15. Once we have located our Object to JSON transformer we can drag it onto our Mule Flow.  It is that easy in the Mule ESB platform; no custom pipeline components or 3rd party libraries are required for this to work.  In the event we want to construct our own JSON format, we can use the AnyPoint DataMapper to transform our Database result into a more customized JSON structure without any custom code.
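Conceptually, the Database connector hands the flow a list of row maps and the Object to JSON transformer serializes that payload into a JSON string. A rough sketch of that idea outside of Mule, using the Jackson library (an assumed dependency here) and made-up row values, looks like this:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class RowsToJson {
    public static void main(String[] args) throws Exception {
        // Shape of the payload the Database connector produces: one map per row.
        List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
        Map<String, Object> row = new LinkedHashMap<String, Object>();
        row.put("ID", 1);                      // hypothetical values for illustration
        row.put("ProductName", "Widget");
        row.put("Manufacturer", "Contoso");
        row.put("Quantity", 10);
        row.put("Price", 9.99);
        rows.add(row);

        // Roughly what the Object to JSON transformer does for us inside the flow.
        String json = new ObjectMapper().writeValueAsString(rows);
        System.out.println(json);
    }
}
```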
This concludes the build of our very simple
API. The key message to take away is how easy it was to build this with
the Mule ESB platform and without any additional custom coding.

Testing our Mule Application


  1. For now we will just deploy our application locally and can do so by clicking on Run – Run As – Mule Application.

  2. To call our API, launch Fiddler, a web browser or any other HTTP-based testing tool and navigate to http://localhost:8081/Products.  As you can see, the contents of our Azure SQL Database are returned in JSON format…pretty cool.
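If you prefer a scripted check over Fiddler or a browser, a small HTTP client along these lines does the same thing. The same check works later against the CloudHub URL; only the address changes.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class CallProductsApi {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8081/Products");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        System.out.println("HTTP " + conn.getResponseCode());
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);   // the JSON payload returned by the flow
            }
        } finally {
            conn.disconnect();
        }
    }
}
```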
We are not done yet, though.
One of the benefits of the Mule ESB platform is that there is complete symmetry between the On-Premise version of the ESB and the Cloud version.  This means we can build an application for use locally, or On-Premise, and if we decide that we do not want to provision local infrastructure we can take advantage of MuleSoft’s managed service.  There are no code migration wizards or tools required in order to move an application between environments.  We simply choose to deploy our Mule Application to a different endpoint.
In MuleSoft’s case, we call our Integration Platform as a Service (iPaaS) CloudHub.  There are too many details to cover in this post, so for the time being please visit the CloudHub launch page for more information.


Deploying to CloudHub


  1. In order to deploy our application to CloudHub there is one configuration change that we need to make.  Within our src/main/app/mule-app.properties file we want to specify a dynamic port for our HTTP Endpoint by adding the entry http.port=8081.

  2. Next, we want to update our HTTP Endpoint to use this dynamic port.  In order to enable this placeholder we will click on our HTTP Endpoint and then update the Port text box to contain ${http.port}.  This will allow our application to read the port number from configuration instead of it being hard-coded into our Mule Flow.  Since CloudHub is a multi-tenant environment we want to drive this value through configuration.
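Outside of Mule, the idea behind the ${http.port} placeholder is simply externalized configuration: the port lives in a properties file rather than in the flow. A minimal sketch of reading that same file in plain Java, assuming it is run from the project root, looks like this:

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.Properties;

public class ReadHttpPort {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // The same file the Mule project uses for its property placeholders.
        try (InputStream in = new FileInputStream("src/main/app/mule-app.properties")) {
            props.load(in);
        }
        // Fall back to 8081, the value we used locally, if the property is missing.
        int port = Integer.parseInt(props.getProperty("http.port", "8081"));
        System.out.println("HTTP endpoint will listen on port " + port);
    }
}
```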

  3. With our configuration value set, we can now deploy to CloudHub by right-clicking on our Project and then selecting CloudHub – Deploy to CloudHub.  Note: for the purpose of this blog post we have chosen to deploy our application from our IDE.  While this may be the only way to deploy your application on some other platforms, it is not the only way to perform this deployment with MuleSoft.  We can also deploy our application via the portal or through a continuous integration process.

  4. We now need to provide our CloudHub credentials and some additional configuration including Environment, Domain, Description, and Mule Version.  For this blog post I am using the Early Access Edition, but prior versions of Mule ESB are capable of running in CloudHub.  Also note that our dynamic port value has been carried over from our local configuration to our CloudHub configuration.  Once we have completed this configuration we can click on the Finish button to begin our Deployment.

  5. Within a few seconds we will receive a message indicating that our Application has been successfully uploaded to CloudHub.  This doesn’t mean that it is ready for use, however; the provisioning process is still taking place.

  6. If we log into our CloudHub portal we will discover that our application is being provisioned.
 

  7. After a few minutes, our application will be provisioned and available for API calls.

  8. Before we try to run our application, there is one more outstanding activity.  Earlier in this walkthrough we discussed how Microsoft Azure restricts access to Azure SQL Databases by providing a Firewall.  We now need to ‘white list’ our CloudHub IP Address in Microsoft Azure.  To get our CloudHub IP Address, click on Logs and then set the All Priorities drop-down to System.  Next, look for the log line that reads “… Your application has started successfully.” and note the IP Address it contains.  Copy this IP Address and then log back into the Azure Portal.

  9. Once we have logged back into the Microsoft Azure Portal, we need to select our MuleAPI database and then click on the DASHBOARD label.  Finally, we need to click on the Manage allowed IP Addresses link.

  10. Add a row to the IP Address table that includes the IP Address from the CloudHub logs, and then click the Save icon.
NOTE: A question that you may be asking yourself is:
what happens if my CloudHub IP Address changes?  The answer is you can
provision a CloudHub instance with a Static IP Address by contacting
MuleSoft Support.  Another option is to specify a broader range of IP
Addresses to ‘white list’ within the Microsoft Azure portal.  Once
again, MuleSoft Support can provide some additional guidance in this
area if this is a requirement.
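If you would rather script the firewall change than use the portal, Azure SQL also exposes a stored procedure for server-level firewall rules. The sketch below is an assumption-laden illustration: it presumes your login is allowed to run sp_set_firewall_rule against the master database, and the rule name and IP addresses are placeholders to be replaced with the value taken from the CloudHub logs.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class WhitelistCloudHubIp {
    public static void main(String[] args) throws Exception {
        // Server-level firewall rules are managed from the master database, not muleAPI.
        String url = "jdbc:sqlserver://<yourserver>.database.windows.net:1433;"
                + "database=master;user=<user>@<yourserver>;password=<password>;encrypt=true;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Rule name and address range are placeholders; use the IP from the CloudHub logs.
            stmt.execute("EXEC sp_set_firewall_rule N'CloudHub', '<cloudhub-ip>', '<cloudhub-ip>'");
        }
    }
}
```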


 

Testing our API


  1. We can now test our API running in MuleSoft’s CloudHub instead of on our local machine.  Once again, fire up Fiddler or whatever API tool you like to use, and this time provide your CloudHub URL.  As you can see, our results are once again returned in JSON format, but we are not using any local infrastructure this time!

Telemetry

A mandatory requirement of any modern-day Cloud platform is some level of visibility into the services that are being utilized.  While the purpose of this post is not to get into exhaustive detail, I did think it would be interesting to briefly display the CloudHub Dashboard after our API test.  Similarly, we can also see some database analytics via the Microsoft Azure portal.

Conclusion

In this blog post we discussed a few concepts including:

  • Connecting to a Microsoft Azure SQL Database using Mule ESB

  • Exposing a simple API around our Azure SQL Database

  • Demonstrating the symmetry between Mule ESB On-Premise and its iPaaS equivalent, CloudHub
These different concepts highlight some of the popular trends within the computing industry.  We are seeing broader adoption of Cloud-based Database platforms, an explosion of new APIs being introduced and, finally, the evolution of Integration Platform as a Service offerings.  As demonstrated in this blog post, MuleSoft is positioned very well to support each of these different scenarios.
Another important consideration is that we were only ‘scratching the surface’ when it comes to some of the features that are available in the MuleSoft platform.  For instance, this post didn’t even touch on our comprehensive Anypoint Platform for APIs, which provides full life cycle support for Designing and Engaging, Building and Running/Managing APIs. If this sounds interesting, sign up for a free trial account and give it a try.