While trying to reproduce an issue in Azure Reporting, I found myself building a simple worker role that generated a report using the ReportViewer control in server mode. I found a couple of gaps in the available content, so I thought I would try to post a more complete example here.
NOTE: I am assuming that you know how to create an Azure deployment and an Azure Reporting instance, plus design and publish a basic report.
The first thing I had to do was create a basic report. The report and the data source look like this:
I then published this report and the associated data source to my Azure Reporting Instance using the built-in BIDS functionality.
------ Deploy started: Project: Report Project1, Configuration: DebugLocal ------
Deploying to https://igwbloe2yk.reporting.windows.net/reportserver
Deploying data source '/Data Sources/DataSource1'.
Deploying report '/SubReportRepro/BaseReport'.
Deploying report '/SubReportRepro/MasterReport2'.
Deploy complete -- 0 errors, 0 warnings
Next, I created a Windows Azure Worker Role project. Because Azure Reporting is protected by Forms Authentication, I had to add a custom class to manage the user credentials. Although I modified the code a bit so I didn’t have to hardcode the credentials, it is pretty much identical to the MSDN documentation on this class. However, because the MSDN code sample is missing the using statements, here is the complete code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.Reporting.WebForms;
using System.Net;
using System.Security.Principal;
namespace WebRole1
{
/// <summary>
/// Implementation of IReportServerCredentials to supply forms credentials to SQL Reporting using GetFormsCredentials()
/// </summary>
public class ReportServerCredentials : IReportServerCredentials
{
private string _user;
private string _password;
private string _authority;
public ReportServerCredentials(string user, string password, string authority)
{
_user = user;
_password = password;
_authority = authority;
}
public WindowsIdentity ImpersonationUser
{
get
{
return null;
}
}
public ICredentials NetworkCredentials
{
get
{
return null;
}
}
public bool GetFormsCredentials(out Cookie authCookie, out string user, out string password, out string authority)
{
authCookie = null;
user = _user;
password = _password;
authority = _authority;
return true;
}
}
}
Next, I had to write the worker role code. Again, this code is stock worker role code with the exception of the code inside the Run method. The ReportViewer manipulation code is stock ReportViewer code from MSDN as is the blob storage code.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.Reporting.WebForms;
using System.IO;
namespace WorkerRole1
{
public class WorkerRole : RoleEntryPoint
{
public override void Run()
{
// This is a sample worker implementation. Replace with your logic.
Trace.WriteLine("$projectname$ entry point called", "Information");
while (true)
{
try
{
Trace.WriteLine("Rendering a report", "Information");
//Instantiate an instance of the ReportViewer control
//Since all I am doing is rendering, this is much easier than doing SOAP API calls
Microsoft.Reporting.WebForms.ReportViewer rv = new Microsoft.Reporting.WebForms.ReportViewer();
rv.ProcessingMode = ProcessingMode.Remote;
rv.ServerReport.ReportServerUrl = new Uri(RoleEnvironment.GetConfigurationSettingValue("RSUrl"));
rv.ServerReport.ReportPath = RoleEnvironment.GetConfigurationSettingValue("ReportPath");
//Supply the forms credentials; the authority is the report server host, so strip the URL scheme
rv.ServerReport.ReportServerCredentials = new WebRole1.ReportServerCredentials(
RoleEnvironment.GetConfigurationSettingValue("User"),
RoleEnvironment.GetConfigurationSettingValue("Password"),
RoleEnvironment.GetConfigurationSettingValue("RSUrl").Replace("https://", "").Replace("http://", ""));
Warning[] warnings;
string[] streamids;
string mimeType;
string encoding;
string extension;
byte[] bytes = rv.ServerReport.Render(
"PDF", null, out mimeType, out encoding, out extension,
out streamids, out warnings);
Trace.WriteLine("Writing report to storage");
//first, set up the connection to blob storage
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(Microsoft.WindowsAzure.CloudConfigurationManager.GetSetting("TargetReportStorage"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container that already exists
CloudBlobContainer container = blobClient.GetContainerReference(RoleEnvironment.GetConfigurationSettingValue("TargetContainer"));
container.CreateIfNotExists();
// Retrieve reference to a blob named "myblob".
CloudBlockBlob blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString() + ".pdf");
//wrap the rendered bytes in a stream and upload it to the blob
using (MemoryStream fs = new MemoryStream())
{
fs.Write(bytes, 0, bytes.Length);
fs.Seek(0, SeekOrigin.Begin);
blockBlob.UploadFromStream(fs);
}
}
catch (Exception ex)
{
Trace.WriteLine(ex.Message + ex.StackTrace);
}
Thread.Sleep(Convert.ToInt32(RoleEnvironment.GetConfigurationSettingValue("BetweenReportsMS")));
}
}
public override bool OnStart()
{
// Set the maximum number of concurrent connections
ServicePointManager.DefaultConnectionLimit = 12;
try
{
// For information on handling configuration changes
// see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
// Schedule a transfer period of one minute.
config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
// Display information about the default configuration.
//ShowConfig(config);
// Apply the updated configuration to the diagnostic monitor.
// The first parameter is for the connection string configuration setting.
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
}
catch (Exception e)
{
Trace.WriteLine("Exception during WebRole1.OnStart: " + e.ToString());
// Take other action as needed.
}
return base.OnStart();
}
private void ShowConfig(DiagnosticMonitorConfiguration config)
{
try
{
if (null == config)
{
Trace.WriteLine("Null configuration passed to ShowConfig");
return;
}
// Display the general settings of the configuration
Trace.WriteLine("*** General configuration settings ***");
Trace.WriteLine("Config change poll interval: " + config.ConfigurationChangePollInterval.ToString());
Trace.WriteLine("Overall quota in MB: " + config.OverallQuotaInMB);
// Display the diagnostic infrastructure logs
Trace.WriteLine("*** Diagnostic infrastructure settings ***");
Trace.WriteLine("DiagnosticInfrastructureLogs buffer quota in MB: " + config.DiagnosticInfrastructureLogs.BufferQuotaInMB);
Trace.WriteLine("DiagnosticInfrastructureLogs scheduled transfer log filter: " + config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter);
Trace.WriteLine("DiagnosticInfrastructureLogs transfer period: " + config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod.ToString());
// List the Logs info
Trace.WriteLine("*** Logs configuration settings ***");
Trace.WriteLine("Logs buffer quota in MB: " + config.Logs.BufferQuotaInMB);
Trace.WriteLine("Logs scheduled transfer log level filter: " + config.Logs.ScheduledTransferLogLevelFilter);
Trace.WriteLine("Logs transfer period: " + config.Logs.ScheduledTransferPeriod.ToString());
// List the Directories info
Trace.WriteLine("*** Directories configuration settings ***");
Trace.WriteLine("Directories buffer quota in MB: " + config.Directories.BufferQuotaInMB);
Trace.WriteLine("Directories scheduled transfer period: " + config.Directories.ScheduledTransferPeriod.ToString());
int count = config.Directories.DataSources.Count, index;
if (0 == count)
{
Trace.WriteLine("No data sources for Directories");
}
else
{
for (index = 0; index < count; index++)
{
Trace.WriteLine("Directories configuration data source:");
Trace.WriteLine("\tContainer: " + config.Directories.DataSources[index].Container);
Trace.WriteLine("\tDirectory quota in MB: " + config.Directories.DataSources[index].DirectoryQuotaInMB);
Trace.WriteLine("\tPath: " + config.Directories.DataSources[index].Path);
Trace.WriteLine("");
}
}
// List the event log info
Trace.WriteLine("*** Event log configuration settings ***");
Trace.WriteLine("Event log buffer quota in MB: " + config.WindowsEventLog.BufferQuotaInMB);
count = config.WindowsEventLog.DataSources.Count;
if (0 == count)
{
Trace.WriteLine("No data sources for event log");
}
else
{
for (index = 0; index < count; index++)
{
Trace.WriteLine("Event log configuration data source:" + config.WindowsEventLog.DataSources[index]);
}
}
Trace.WriteLine("Event log scheduled transfer log level filter: " + config.WindowsEventLog.ScheduledTransferLogLevelFilter);
Trace.WriteLine("Event log scheduled transfer period: " + config.WindowsEventLog.ScheduledTransferPeriod.ToString());
// List the performance counter info
Trace.WriteLine("*** Performance counter configuration settings ***");
Trace.WriteLine("Performance counter buffer quota in MB: " + config.PerformanceCounters.BufferQuotaInMB);
Trace.WriteLine("Performance counter scheduled transfer period: " + config.PerformanceCounters.ScheduledTransferPeriod.ToString());
count = config.PerformanceCounters.DataSources.Count;
if (0 == count)
{
Trace.WriteLine("No data sources for PerformanceCounters");
}
else
{
for (index = 0; index < count; index++)
{
Trace.WriteLine("PerformanceCounters configuration data source:");
Trace.WriteLine("\tCounterSpecifier: " + config.PerformanceCounters.DataSources[index].CounterSpecifier);
Trace.WriteLine("\tSampleRate: " + config.PerformanceCounters.DataSources[index].SampleRate.ToString());
Trace.WriteLine("");
}
}
}
catch (Exception e)
{
Trace.WriteLine("Exception during ShowConfig: " + e.ToString());
// Take other action as needed.
}
}
}
}
Those of you who are paying close attention might have noticed that I use RoleEnvironment.GetConfigurationSettingValue("XXXXX") for all of my passwords, connection strings, etc. This is handy because it allows me to configure those values at run time instead of design time using standard Windows Azure mechanisms. You can edit these settings either via the Windows Azure portal in production or in Visual Studio during development. Here’s what the Visual Studio dialog looks like:
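Behind that dialog (and the portal), these values live in the ServiceConfiguration.cscfg file. As a point of reference, here is a rough sketch of what the settings section for this worker role might look like; the setting names match the ones the code reads, but every value shown is a placeholder you would replace with your own:
<ServiceConfiguration serviceName="ReportWorker" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Report server URL and the path of the report to render -->
      <Setting name="RSUrl" value="https://YOURSERVER.reporting.windows.net/reportserver" />
      <Setting name="ReportPath" value="/SubReportRepro/BaseReport" />
      <!-- Forms authentication credentials for Azure Reporting -->
      <Setting name="User" value="YOUR-REPORTING-USER" />
      <Setting name="Password" value="YOUR-REPORTING-PASSWORD" />
      <!-- Blob storage account, target container, and delay between renders -->
      <Setting name="TargetReportStorage" value="DefaultEndpointsProtocol=https;AccountName=YOURACCOUNT;AccountKey=YOURKEY" />
      <Setting name="TargetContainer" value="reports" />
      <Setting name="BetweenReportsMS" value="60000" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>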
Now, here is the tricky part. Because I elected to use the ReportViewer control, I need to ensure that the ReportViewer assemblies are accessible to my Windows Azure role. They aren’t part of the standard Azure deployment, so that leaves me with two choices:
- Add a startup task to install the ReportViewer control
- Upload copies of the assemblies as part of my deployment
Option 1 isn’t very difficult, but I wanted to minimize the size of my deployment package, so I elected to go with option 2. The easy part was making sure that the Copy Local setting of the Microsoft.ReportViewer.Common and Microsoft.ReportViewer.WebForms assemblies was set to True. Doing the same for Microsoft.ReportViewer.DataVisualization and Microsoft.ReportViewer.ProcessingObjectModel was a bit trickier because they live in the GAC. First, I had to manually copy them out of the GAC and into my project folder, then I had to add explicit references to the local copies of these assemblies, and lastly, just like the other ReportViewer assemblies, I had to ensure that the Copy Local property was set to True. A sketch of what the resulting references look like follows below.
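If you open the worker role’s .csproj in a text editor, the result looks roughly like the following. This is an illustrative sketch, not a dump of my actual project file: the lib folder name and the assumption that the GAC copies live inside the project are mine, and your assembly versions may differ depending on which ReportViewer release you have installed.
<ItemGroup>
  <!-- Assemblies that were already plain references: just set Copy Local (Private) to True -->
  <Reference Include="Microsoft.ReportViewer.Common">
    <Private>True</Private>
  </Reference>
  <Reference Include="Microsoft.ReportViewer.WebForms">
    <Private>True</Private>
  </Reference>
  <!-- GAC-only assemblies: point HintPath at the copies pulled out of the GAC into the project folder -->
  <Reference Include="Microsoft.ReportViewer.DataVisualization">
    <HintPath>lib\Microsoft.ReportViewer.DataVisualization.dll</HintPath>
    <Private>True</Private>
  </Reference>
  <Reference Include="Microsoft.ReportViewer.ProcessingObjectModel">
    <HintPath>lib\Microsoft.ReportViewer.ProcessingObjectModel.dll</HintPath>
    <Private>True</Private>
  </Reference>
</ItemGroup>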
Now, after deploying my worker role to Azure using standard techniques, I could watch blob storage and see the reports it generated start to appear.
At this point, I want to take a minute and plug Windows Azure’s scalability. By increasing the number of instances behind my worker role to 50 (just a simple configuration change), I was able to generate more than 60K reports over the course of the next 8 hours. Then, once my testing was done, I deleted the deployment. Try configuring 50 on-premises machines and then finding a new home for them after just 8 hours. You will probably find lots of people who will take them, but good luck getting paid anywhere near the purchase price!