We recently deployed Telerik’s Sitefinity platform to an Azure website with a SQL Azure database backend. Most of the existing information on releasing Sitefinity to Azure assumes the web frontend is hosted in an Azure Cloud Services Web Role rather than an Azure website. What information I have found for this type of deployment is missing steps and contains trouble spots that made the setup difficult to complete. This walkthrough of getting Sitefinity into an Azure website with a SQL Azure backend should simplify setup and help you avoid some of the pitfalls inherent to the process.

For this walkthrough, you will need access to the following:

  1. Sitefinity Project Manager (which can be downloaded from the sitefinity.com site).
  2. Visual Studio.
  3. A SQL Server instance that is not in SQL Azure (either a local installation or a standard server install of SQL Server).

Create Sitefinity Instance

The first step in creating a Sitefinity instance that will work in Azure is just like creating any other Sitefinity instance using the Sitefinity Project Manager. If you are unfamiliar with this process, visit http://docs.sitefinity.com/create-projects for information to get you started. 

Initial Database Setup

Once you have created your new Sitefinity instance, you will need to configure the initial SQL Server in which the database will be created. Unfortunately, the Sitefinity UI does not yet allow the database to be created directly in SQL Azure. Because of this, you will need to select “Microsoft SQL Server” and create the database on a standard Microsoft SQL Server instance (either locally or on a remote server).

[Screenshot: Sitefinity database setup]

Once the database has been created, you will want to run some quick tests to make sure you are starting with a good install of Sitefinity. I would suggest navigating around the Admin area a bit and creating a few sample pages. Once you are satisfied that your installation is working correctly, you can proceed to migrating your database to your SQL Azure instance.

Migrate Database to SQL Azure

There are several ways to migrate the database that Sitefinity created to SQL Azure. My preference is to use the tools built into SQL Management Studio, but the SQL Azure Migration Wizard (https://sqlazuremw.codeplex.com/) is also an effective way to accomplish the same goal.

In order to use the built-in tools in SQL Management Studio, follow the steps below (a command-line alternative follows the list):

  1. Open SQL Management Studio and connect to the server where you had Sitefinity create its database.
  2. Right-click the database and select Tasks -> Deploy Database to Windows Azure SQL Database.
  3. You will be presented with a screen where you can enter all of the information for your SQL Azure instance. Fill in your details and continue to the end of the wizard.
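
If you would rather script the migration, the same export and import can be done with SqlPackage.exe (installed with SQL Server Data Tools) by exporting the local database to a BACPAC file and importing it into SQL Azure. A minimal sketch, where the server names, database name, and credentials are placeholders:

    REM Export the local Sitefinity database to a BACPAC file
    SqlPackage.exe /Action:Export /SourceServerName:localhost ^
        /SourceDatabaseName:SitefinityDb /TargetFile:SitefinityDb.bacpac

    REM Import the BACPAC into your SQL Azure server
    SqlPackage.exe /Action:Import /SourceFile:SitefinityDb.bacpac ^
        /TargetServerName:yourserver.database.windows.net ^
        /TargetDatabaseName:SitefinityDb ^
        /TargetUser:youruser /TargetPassword:yourpassword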

Once this has completed, you are ready to move on to getting your local Sitefinity instance running against the new SQL Azure database.

Updating Configs for SQL Azure

Once you have successfully created the new SQL Azure database, there are two configuration files in your Sitefinity solution that will need to be modified. Open the solution in Visual Studio and modify the config files as described below:

  1. Open the file SitefinityWebApp/App_Data/Sitefinity/Configuration/DataConfig.config (you may need to show files not included in the project to see this).
  2. Replace the connection string with the correct connection string for your Azure database.
  3. Update the “dbType” attribute from “MsSql” to “SqlAzure” (a sample of the finished file appears after this list).
  4. Open the web.config file.
  5. You should see two sections that look like this:

    [Screenshot: web.config sections commented out]

  6. Uncomment these sections so they look like this:

    [Screenshot: web.config sections uncommented]
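
For reference, after steps 1 through 3 your DataConfig.config should look roughly like the sketch below; the server name, database name, and credentials shown are placeholders you will need to replace with your own values:

    <?xml version="1.0" encoding="utf-8"?>
    <dataConfig xmlns:config="urn:telerik:sitefinity:configuration">
      <connectionStrings>
        <!-- yourserver, SitefinityDb, and the credentials below are placeholders -->
        <add name="Sitefinity"
             dbType="SqlAzure"
             connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=SitefinityDb;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;" />
      </connectionStrings>
    </dataConfig>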

Theoretically, after this has been completed, you should be able to run your Sitefinity instance locally against the Azure database. In practice, I found I was getting an error due to an incompatible version of the Windows Azure SDK. You can find the currently supported versions of the SDK at http://docs.sitefinity.com/reference-windows-azure-sdk-supported-versions. Currently, the latest version of Sitefinity requires version 2.6 of the Azure SDK, which was older than the version installed on my system. To get the older version, I needed to install Visual Studio 2012 on my PC so that I could use the installer provided on the same page. These steps may vary depending on your current system setup.

Once the correct version of the Azure SDK is installed, you should be able to run your Sitefinity site locally and connect to your Azure database.

Publishing Sitefinity to Azure Website

The final steps to getting Sitefinity running in an Azure website are pretty quick. Because Azure websites have newer, up-to-date versions of the Azure SDK installed, publishing the site directly at this point would produce the same incompatible-SDK error we saw locally. To resolve this, we really only need one DLL from the locally installed SDK.

  1. Navigate to the installation location of your Azure SDK (for me this was C:\Program Files\Microsoft SDKs\Azure).
  2. Inside that folder go to .NET SDK\v2.6\bin\runtimes\base\x64.
  3. Copy the file msshrtmi.dll into your Sitefinity project and add a reference to it.
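
If you add the reference by hand rather than through the Visual Studio UI, the resulting entry in the .csproj should look roughly like the sketch below; the Lib folder in the HintPath is just an example of where you might place the copied DLL:

    <!-- Reference to the Azure SDK interop assembly copied into the project.
         The Lib folder in HintPath is illustrative; point it at wherever you placed the file. -->
    <Reference Include="msshrtmi">
      <HintPath>Lib\msshrtmi.dll</HintPath>
    </Reference>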

In addition to this, I have found it necessary to set Copy Local = True on all references. This can be done with the following steps:

  1. Expand the “References” section of your project.
  2. Select all references that appear.
  3. Right click and select “Properties”.
  4. Change the “Copy Local” property to True.
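
Behind the scenes, Visual Studio persists Copy Local as the <Private> element on each reference in the .csproj, so after this change each reference should look roughly like the following (the assembly name here is just one example):

    <Reference Include="Telerik.Sitefinity">
      <Private>True</Private>
    </Reference>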

Once these changes are made, you are ready to publish your site to your Azure website. This can be done using any method currently available for publishing to an Azure website, and no special configuration should be required beyond this point.
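
As one example, if you have downloaded a publish profile from the Azure portal and imported it into the project, a Web Deploy publish can be run from the command line with MSBuild; the profile name and password below are placeholders:

    msbuild SitefinityWebApp.csproj /p:Configuration=Release ^
        /p:DeployOnBuild=true ^
        /p:PublishProfile="yoursite - Web Deploy" ^
        /p:Password=yourpublishpassword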

Big Data is an inescapable buzzword for anyone even remotely entrenched in the world of business or technology in 2016. With the phrase being so ubiquitous and generic without proper context, it can be hard to understand exactly what this movement is all about. Big Data as it exists today is a growing and emergent field. Technology is constantly being introduced that either creates new data streams or offers a new way to make sense of the data already being produced. What, exactly, should we do with all this information? That is the central problem Big Data proposes to solve: to capture, analyze, and make sense of the wealth of data at our disposal.

A Retrospective

The first recorded attempts to quantify and explain large and exponentially growing sets of data came in the 1940s, when Fremont Rider forecast exponential growth for libraries. Library collections had been doubling in size roughly every sixteen years, and at that rate he predicted that libraries would require an outrageous number of employees to maintain their volumes. In 1961, Derek Price made a similar observation about the growth of recorded scientific knowledge, stating that this information grows exponentially as each scientific discovery spawns another series of discoveries. Placing an emphasis on capturing and understanding data, and overcoming the complexities involved in that process, are evidently not new ideas.

In 1990, Peter Denning asked: “What machines can we build that will monitor the data stream of an instrument, or sift through a database of recordings, and propose for us a statistical summary of what’s there? … it is possible to build machines that can recognize or predict patterns in data without understanding the meaning of the patterns. Such machines may eventually be fast enough to deal with the large data streams in real time … With these machines, we can significantly reduce the number of bits that must be saved, and we can reduce the hazard of losing latent discoveries from burial in an immense database.” So the idea of making sense of Big Data in a modern way isn’t brand new either. The blueprint of the modern notion and science of Big Data has seemingly been around for decades.

The Present

Today, data originates from an increasing number of places: our internet browsers, PCs, mobile devices, Fitbits, cameras, microphones, and wireless networks. New sources that create and transmit data are introduced to the market all the time, and surveying all the places data can come from, the supply seems endless. Individuals and companies want to harness and make sense of all this data in order to observe patterns, analyze trends, and hopefully predict outcomes based on past interactions. Every company in every industry can see value in being able to capture and quantify the increasingly available streams of data produced by our electronic devices.

In 2015, 90% of organizations reported investments in Big Data initiatives, and two-thirds of those organizations claimed the initiatives had a measurable impact on revenue. In the current climate, organizations see Big Data and analytics as a way to gain a competitive advantage. The companies capitalizing on Big Data the most are looking beyond transactional data and using what they collect in new ways, such as creating new business models or monetizing the data by selling it to external companies. The most used types of data are location data and text data. This is Big Data as we see it today: leveraging technology to analyze new data streams and fine-tune business models based on the findings. As we have already addressed, this isn’t a new concept, just the latest iteration.

Moving Forward

A concern for Big Data moving forward is privacy, and how much of it we are willing to compromise in the pursuit of gathering more and more data. Analyzing data that is captured passively or without the consent of the individual raises privacy concerns, even though it could provide very useful insights into trends and behaviors. The Organisation for Economic Co-operation and Development’s privacy guidelines also dictate that data should be discarded once its original purpose is achieved, and discarding data goes against a central tenet of Big Data. So besides the primary problems facing Big Data, capturing all the data and having a means of making sense of it, privacy concerns for individuals and companies are at the forefront of issues to be addressed.

Other open questions for Big Data are where data sets will come from and how we will capture and analyze them. With technology changing at an ever-increasing rate, it’s almost impossible to forecast the what and how of Big Data initiatives. What is likely to remain true is that the amount of data we produce will continue to increase, and we will continue to place a premium on gathering and understanding that data. The philosophy of Big Data is an old one: gather data and analyze it in a way that yields some sort of competitive advantage. The execution of that philosophy is fluid, dynamic, and challenging.
