
Introduction:

SSAS has a nice and useful feature for defining calculations (Calculated Members and Named Sets). These calculations can then be used in MDX or by any other client tool, and they come in very handy for letting users analyse their data with the required parameters, attributes and custom calculations. Using Named Sets, we can predefine the member(s) that satisfy certain conditions, and the user can then directly use them to see the measures/data against those members.

Let's say we want to offer the user a set of the last 5 dates that contain transactional data, even though there may be plenty of later dates with no fact data.

So for this we can create a regular Named Set on the Calculations tab, something like:

CREATE SET ActualSalePeriods AS
Tail
(
    NonEmpty([Date Order].[Calendar].[Month].[Date].Members,
             Measures.[Sales Amount]),
    5
);

Every calculated member/named set created here is defined with the CREATE keyword behind the scenes, as part of the cube's MDX script.

These Named Sets are evaluated every time the cube is processed, so at the next process, if the cube receives data with new dates, the result set of the Named Set is refreshed accordingly. So we can say they are static in nature, and their contents remain intact until the cube is reprocessed.
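Once deployed, such a set can be referenced directly in an MDX query; a minimal example (the cube name here is assumed):

SELECT [Measures].[Sales Amount] ON COLUMNS,
       [ActualSalePeriods] ON ROWS
FROM [Sales Cube];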

Dynamic Named Set:

Regular Named Sets are static in nature (the only option up to the 2005 version), which improves query performance by not re-evaluating them against the cube on every query. In fact, this works well in situations like the "latest 5 dates" case above: as and when new dates with data get added to the cube, it is reprocessed and the set is updated with the latest members.

However, this can cause a serious issue when the actual requirement is, say, to get the top 10 employees with the highest sales on a monthly basis. A static Named Set, evaluated at processing time, will give the correct result set with those employees. But if the user then wants to slice/dice the result set by some other dimension, say region or product, they will be offered the same result set generated earlier at processing time, i.e. wrong information.

To cope with such requirements, SSAS 2008 introduced one more option for creating a Named Set: 'Dynamic'. The developer can either select Static (the default) for a regular Named Set or select Dynamic to give the set dynamic behaviour. On selecting the Dynamic option, SSAS adds the keyword DYNAMIC before the SET keyword to make it a dynamic named set.
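For illustration, a dynamic set for the top-10 requirement above might look like this behind the scenes (dimension, hierarchy and measure names are assumed from a typical sales cube):

CREATE DYNAMIC SET CURRENTCUBE.[Top10SalesEmployees] AS
TopCount
(
    [Employee].[Employee].[Employee].Members,
    10,
    [Measures].[Sales Amount]
);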

Dynamic sets are not calculated once. They are calculated before each and every query and, very importantly, in the context of that query's WHERE clause and sub-selects.

Thus the above requirement of getting the top 10 sales employees works absolutely fine with a Dynamic Named Set.

Imp: A dynamic named set not only resolves the issue mentioned above but also adds a capability to MDX that regular named sets could not deliver. This concerns multi-select filters placed in a sub-select rather than the WHERE clause (which is the case with Excel, which always converts WHERE into a sub-select). With dynamic sets, multi-select statements are evaluated properly even when used in sub-select statements.
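For illustration, reusing the assumed names from the sketch above: with a DYNAMIC set, the sub-select in the query below restricts which employees qualify for the top 10, just as a WHERE clause would, whereas a static set would ignore it:

SELECT [Measures].[Sales Amount] ON COLUMNS,
       [Top10SalesEmployees] ON ROWS
FROM ( SELECT { [Geography].[Region].&[North], [Geography].[Region].&[South] } ON COLUMNS
       FROM [Sales Cube] );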

Problem:

Populating a data warehouse or any other database may involve various disparate source types like SQL Server, Excel spreadsheets, DB2 etc. It can also be a flat file/CSV source.

With data sources like SQL Server or Oracle, where we generally get the data by querying tables or views, we can put in logic to remove duplicates, for example by selecting DISTINCT values. But with a flat file data source we don't have much control in terms of querying the data, and we still want duplicate records removed before loading them into the target database/file.
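For a relational source the duplicates can indeed be removed in the source query itself; a trivial sketch (table and column names are illustrative):

SELECT DISTINCT A, B, C, D
FROM SourceTable;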

 Solution:

Let’s take an example where we have a flat file with four columns (A,B,C,D) and some records (100,200,300,400 etc) as shown below:

(Screenshot: Source File)

And now say we want all of the columns to be loaded into the target database, but with duplicate records (lines) removed.

Approach:

(Screenshot: the overall data flow, Flat File Source -> Aggregate -> destination, as described in the steps below.)

Steps:

1. Configure the Flat File Source as required by selecting the file, delimiters (column and header) etc.

2. Now drag an 'Aggregate' transformation after the Flat File Source and configure it to 'Group By' the columns/fields that decide whether a record is unique.

3. Configure destination by mapping the columns from Aggregate transformation to the target database/file.

Run the package, and you will see that no duplicate records are inserted into the target database/file.

(Screenshot: Dest File)

Introduction:

Dundas adds new capabilities to Reporting Services (SSRS) in the form of more featured and customizable controls. Not only do they look better, with lots of UI options to customize, but they also have powerful features that are very interesting and useful for analysis. The most popular among these is the Dundas Chart control. Existing SSRS charts can also be converted into Dundas charts with just a few right-clicks.

Though there are plenty of built-in options, quite enough to fulfil most reporting requirements, it also lets the developer apply his/her imagination in designing the chart by writing custom code. This is done using the Code Editor of the Dundas control. It provides event handlers in which .NET (C#/VB.NET) code can be placed as required, e.g. PostPaint, PostApplyData, CustomizeChart, CustomizeLegend etc. Each is called at a specific point in time and serves a specific purpose.

Drawing a line on Dundas chart for some specific values:

Here we will try to understand the use of the Code Editor in the Dundas chart with a very simple example. There may be a business need to show a line or some other shape on the chart based upon certain data conditions. You need to be a little familiar with .NET coding as well.

We will take a simple dataset with some dummy values. 

DataSet: DS_Chart
————————————————
SELECT     1000 AS Value, 1 AS Rate
UNION
SELECT     1400 AS Value, 1.5 AS Rate
UNION
SELECT     2100 AS Value, 2 AS Rate
UNION
SELECT     1450 AS Value, 2.75 AS Rate
UNION
SELECT     2300 AS Value, 3 AS Rate
UNION
SELECT     3400 AS Value, 3.5 AS Rate
UNION
SELECT     2340 AS Value, 4 AS Rate
UNION
SELECT     4440 AS Value, 4.5 AS Rate
UNION
SELECT     4200 AS Value, 5.5 AS Rate
UNION
SELECT     3400 AS Value, 6 AS Rate
UNION
SELECT     9600 AS Value, 6.5 AS Rate
UNION
SELECT     5300 AS Value, 7 AS Rate
UNION
SELECT     5600 AS Value, 9 AS Rate
 Order By Rate
————————————————

The idea is to draw a line chart wherein the value (Y-axis) changes over the rate (X-axis). First we will configure the Dundas Chart control for a basic chart.

  1. Right-click the control and select Properties.
  2. Give the chart a title and go to the Data tab.
  3. Select 'DS_Chart' as the dataset.
  4. Add a Value series and there select
     1. Rate on the X-axis
     2. Sum(Value) on the Y-axis.
  5. Add one Category Group with both Expression and Label set to Rate.
  6. Click OK. Go to the Preview tab to see the report. It will look like the following:

Now let's add the custom code to draw a line where 3 < Rate < 4 and Value > 3000.

  7. Go to Properties and click on the Advanced tab.
  8. Click the ViewCode button there.
  9. Select the PostPaint event and put the below code inside the code editor.

// Chart Graphics object
ChartGraphics graphics = e.ChartGraphics;

// Check which chart element fired the event
if (sender is Series)
{
    Series series = (Series)sender;
    // The series name can also be checked via the series.Name property

    double xValue;
    double yValue;

    // Loop over the data points
    foreach (DataPoint point in series.Points)
    {
        // Get the relative position of the point
        xValue = graphics.GetPositionFromAxis("Default", AxisName.X, point.XValue);
        yValue = graphics.GetPositionFromAxis("Default", AxisName.Y, 0);

        // Convert to an absolute point
        PointF absPoint = graphics.GetAbsolutePoint(new PointF((float)xValue, (float)yValue));

        // Draw a line where 3 < Rate < 4 and Value > 3000
        if (point.YValues[0] > 3000 && point.XValue > 3 && point.XValue < 4)
        {
            Pen blackPen = new Pen(Color.Black, 3);
            PointF endPoint = graphics.GetAbsolutePoint(new PointF((float)xValue, (float)40));
            graphics.DrawLine(blackPen, absPoint, endPoint);
        }
    }
}

 

10. Click Ok.

11. Now go to Preview tab to see the report. It will look similar to the following:

As shown, a line has been drawn at the specified condition’s location.

Custom Data Processing

There are many components of SQL Server Reporting Services that can be customized as per requirements, and one of them is how data is fetched/processed. Though SSRS provides quite a good number of inbuilt data providers to connect to various types of data sources (inbuilt data processing extensions), there may be a need to do some custom processing before the data actually gets delivered to the SSRS report.

Sometimes one may also need to merge/join data from multiple data sources (say from an RDBMS database and from a cube), which at first sight looks like a limitation of SSRS. But SSRS offers the facility to extend its services by implementing a Custom Data Extension (CDE).

With a CDE, you write your own custom code (in .NET) which takes the dataset query as its input and returns the result set as per the custom logic implemented in the CDE code. So essentially we are extending/customizing the 'Data Processing Extension'.

To create the custom library, there are certain interfaces which need to be implemented.

These include:   

  • IDbConnection
  • IExtension
  • IDbCommand
  • IDataParameter
  • IDataParameterCollection
  • IDataReader
  • IDbTransaction

Details about Required Extensions:  http://msdn.microsoft.com/en-us/library/ms153967.aspx 

Data Extension Process Flow  

Before developing your custom data extension, you should understand how the report server uses data extensions to process data. You should also understand the constructors and methods that are called by the report server. 

(Diagram: the step-by-step process flow of a data extension as called by the report server.)

Also there are certain dll references to be added in the library solution: 

  • Microsoft.ReportingServices.DataExtension
  • Microsoft.ReportingServices.Interfaces

Implementation of the various interfaces and their methods/properties:

Connection class (implements IDbConnection, IExtension)
This is the entry point of the data extension.
  IExtension methods:
  - SetConfiguration(string config): receives the connections defined in RSReportDesigner.config so they can be used in the code. The whole XML string gets passed in from SSRS.
  IDbConnection methods:
  - Open(): called at the beginning to open the required DB connections.
  - CreateCommand(): creates an IDbCommand object to form the command to be executed.
  - Close(): closes the connections at the end.

Command class (implements IDbCommand and, optionally, IDbCommandAnalysis)
Takes care of preparing the command based upon the command text/query passed from the report dataset and returns the result set in the form of a DataReader.
  IDbCommand methods:
  - CreateParameter(): creates a new instance of an IDbDataParameter, which takes care of parameter evaluation.
  - ExecuteReader(): prepares the result set based upon the parameters and the query passed.
  - CommandText (property): the command/query passed from the report dataset.
  IDbCommandAnalysis methods:
  - GetParameters(): creates and returns a collection (IDataParameterCollection) of IDataParameters. We can also write our own logic here to prepare the collection.

Parameter class (implements IDataParameter)
Contains members such as Value to get the value of a parameter.

Parameter collection class (implements IDataParameterCollection)
Contains methods like Add to add parameters to the collection, and members like IndexOf to get an item's index in the collection.

Data reader class (implements IDataReader)
Comes into the picture when the ExecuteReader method executes; the Command class works in conjunction with this DataReader class to form the result set. Key methods: GetName(), GetOrdinal(), GetValue(), Read() etc.

Transaction class (implements IDbTransaction)
Handles transactions.
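As a minimal sketch of the entry point, the Connection class could look like the following. Class and namespace names are illustrative, the companion Command class is not shown, and the exact interface members should be checked against the Microsoft.ReportingServices.DataProcessing documentation for your SSRS version:

using System;
using Microsoft.ReportingServices.DataProcessing;   // IDbConnection, IDbCommand, IDbTransaction
using Microsoft.ReportingServices.Interfaces;       // IExtension

namespace MyCompany.MyDataExtension
{
    // Illustrative entry point of the CDE.
    public class MyConnection : IDbConnection, IExtension
    {
        private string connectionString;
        private string configData;

        // IExtension: receives the <Configuration> XML from RSReportDesigner.config
        public void SetConfiguration(string configuration)
        {
            configData = configuration;   // parse the connection entries from this XML
        }

        public string LocalizedName
        {
            get { return "My Custom Data Extension"; }
        }

        // IDbConnection members
        public string ConnectionString
        {
            get { return connectionString; }
            set { connectionString = value; }
        }

        public int ConnectionTimeout
        {
            get { return 0; }   // 0 = no timeout
        }

        public void Open()  { /* open the underlying data source(s) here */ }
        public void Close() { /* release the underlying data source(s) here */ }

        public IDbCommand CreateCommand()
        {
            // Return the companion Command class described above (not shown here).
            throw new NotImplementedException();
        }

        public IDbTransaction BeginTransaction()
        {
            return null;   // transactions not supported in this sketch
        }

        public void Dispose()
        {
            Close();
        }
    }
}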

Once the data extension library (.dll) is compiled, it has to be placed into the folder at: C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies

Registering the CDE with report designer & making SSRS report datasource able to access the CDE (RSReportDesigner.Config & RSReportServer.config) 

 (Generic names are used here)

<Extension Name="MyExtension" Type="NameSpacename.ClassImplementingIDbConnection, AssemblyName">
  <Configuration>
    <connections>
      <description><environment>MyCustomExtension</environment></description>
      <connectionstring>
        <name>connectionname1</name>
        <value>Data Source=servername;Initial Catalog=DBName;Provider=MSOLAP.2;</value>
      </connectionstring>
    </connections>
  </Configuration>
</Extension>

Any number of connections can be defined; all will be passed to the CDE as an XML string.

Now, when creating a Data Source in SSRS, this extension will be available for selection as a connection type:

When we use Excel as our data source in SSIS (Excel Source for Excel 2003, OLE DB Source for Excel 2007), we come across the fact that SSIS decides the data type of an Excel column based upon the sheet's metadata: the data source scans the metadata of the Excel sheet and determines the data type of each column.

In this process there are certain other parameters and settings which also play an important role, and once the type is decided, all values of any other data type in the same column are imported as NULL.

Example: if we have both string and number values in the same column, the Excel data source will treat the column either as string or as number. If it considers it a number, the string values are imported as NULL; if the string values are in the majority (within the first 8 cells), it picks the string data type, which allows both kinds of values to be imported.

An interesting point to note: even if we forcefully select the 'DT_WSTR' (string) data type in the first case, it still behaves the same.

The following settings cause this unwanted output, and they are also the levers for configuring the correct results:

  • Setting IMEX=1 in the connection to the Excel sheet. This can be set in the connection string in the case of the Excel Source (Excel 2003) and via 'Extended Properties' in the case of the OLE DB Source (Excel 2007). This setting tells the provider that the Excel sheet has mixed data types and to import the values as text (example connection strings are shown after this list).

We can then forcefully select the 'DT_NTEXT' data type in the data source.

  • Registry setting TypeGuessRows=8: this value tells the connection manager to consider the first 8 rows of a column and decide its data type based upon the majority type among those values. So if the first 5 cells of a column are strings but the next 100 values are numbers, all the numbers would be imported as NULL, since the column type would be String (DT_WSTR). This applies to all packages, i.e. at server level (the registry location is shown after this list).

So if we use a higher value for TypeGuessRows (say 20), the column in this example would be taken as Integer.

But then, again, the remaining string values would be imported as NULL.

  • Using a 'SELECT * FROM [SheetName$]' query instead of the direct table/sheet name. This allows all types of values to get imported as 'DT_WSTR', which is quite acceptable because it keeps all the values of a column. This is a really good option to select.
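To make the first two settings concrete, typical connection strings with IMEX=1 and the Jet registry location for TypeGuessRows are shown below (file paths are placeholders; the registry path differs for the ACE provider/Office version):

   Excel 2003 (Jet):
   Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Files\Data.xls;
   Extended Properties="Excel 8.0;HDR=YES;IMEX=1";

   Excel 2007 (ACE):
   Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Files\Data.xlsx;
   Extended Properties="Excel 12.0;HDR=YES;IMEX=1";

   Registry value (Jet):
   HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel
   TypeGuessRows (DWORD, default 8; 0 makes Jet scan up to the first 16,384 rows)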

Note: in all the cases where we are getting string data (along with number data) by means of one setting or the other, we have to have a string column in the destination.

Requirement:

Generally we deal with flat files which are comma (,) separated, i.e. CSV files, and in SSIS the 'Flat File Source' component reads them very easily. These files can contain results/data from various sources like IBM mainframes, SAP etc. They may also be created manually or generated by custom applications, and so, due to specific requirements or limitations, they may contain data which is not delimited by comma (,), bar (|), semicolon (;) etc., which are among the few options available in the Flat File connection manager used to read flat files in SSIS.

Apart from that, the data may contain commas (,) inside the values themselves, so we cannot just replace all the delimiters, such as dollar ($) or hash (#), with a comma to get a comma-separated file.

So, to read records from a flat file which can have any known separator for its column values, we need to do the kind of exercise explained below.

 Solution:

Since we don't have any inbuilt option to select '#' or '$' etc. as a flat-file delimiter in the connection manager, we will read the whole line/record into one column and then split out the values as per the actual delimiter, using a Script Component.

Let’s take an example:

We have a .txt (flat) file (say InputFile.txt) with three columns A, B and C. The headers are comma separated but the values under these columns are '#' delimited.

Data into the file:

A,B,C

100#200#300

500#600#700

 Steps for the SSIS Package:

  1. Take a Data Flow Task (DFT).
  2. On the data flow tab, Drag one ‘Flat File Source’, edit it and click on ‘Flat File Connection Manager -> New’.
  • Name it as ‘SourceFileConnectionManager’.
  • Select the ‘Inputfile.txt’ in the File Name column by browsing the file.
  • Select Format as ‘delimited’.
  • Select ‘Header row delimiter’ as ‘Comma {,}’.
  • Select the check box for ‘Column names in the first row’.

(Screenshot 1)

  • Go to Column tab of this connection manager.
  • Select ‘{CR} {LF}’ as Row delimiter.

The screen will look like this:

(Screenshot 2)

As you can see, all the data is now coming into a single column named 'A,B,C'.

  • Click OK.
  • Go to ‘Columns’ tab of the Flat File Source.
  • Rename the Output column as ‘A’.

(Screenshot 3)

  • Click Ok.

3. Drag a Script Component and select 'Transformation' when prompted while adding it.

  • Point the output of Flat File Source to this script component.
  • Edit its properties and select “A” as an input column there.
  • Go to ‘Input and Output’ tab and add three output columns (col1, col2 & col3) to the existing output ‘Output 0’. Select ‘four byte signed integer’ as their data type.

(Screenshot 4)

  • Modify the 'SynchronousInputID' of 'Output 0' to 0 (None). This makes the script component asynchronous, so it can emit its own output rows.

(Screenshot 5)

  • Go to ‘Script’ tab and click on ‘Script Designer..’ to edit the script (VB.Net).
  • Select ‘Pre Compile’ as False.
  • Write the following script in the ‘Input0_ProcessInputRow’ event handler:

        ' Split the incoming line on '#' and emit one row with three integer columns
        Dim values() As String
        values = Row.A.Split("#".ToCharArray())

        Output0Buffer.AddRow()
        Output0Buffer.Col1 = Convert.ToInt32(values(0))
        Output0Buffer.Col2 = Convert.ToInt32(values(1))
        Output0Buffer.Col3 = Convert.ToInt32(values(2))

  • Close this and Click Ok.
  4. Take one Flat File Destination and configure its connection manager (DestinationFlatFileConnection) with the settings shown below.

(Screenshot 6)

(Screenshot 7)

  • Go to the Advanced tab and rename the columns from col1, col2 & col3 to A, B & C.
  • Go to the 'Mappings' tab of the Flat File Destination and do the column mappings shown below:

(Screenshot 8)

Now execute the package to get the following output in the 'OutputFile.txt' file.

 A,B,C

100,200,300

500,600,700

———————————————————————————————–

Introduction

In this world of globalization, everyone wants projects, reports etc. to be accessible to users across countries and cultures around the world, and thus one needs SSRS reports to support multiple languages.
Different applications provide multi-lingual support in different ways. Though SSRS does not have inbuilt support for this, we can achieve it using a code-based technique.

Background

In general a report contains numeric data which is analyzed against textual headers/groups. So for a report to truly have multi-lingual support, all that textual information should change dynamically as per the culture/language of the end user.
Since these fields are basically labels/textboxes displaying textual information, it is these labels/textboxes we have to manage for multi-language support.
We must also have some mechanism to detect the language/UI culture of the end user, so we can present the page/report in an acceptable format.

Solution:

However it is implemented, the resource information, i.e. what should be displayed in which language/culture, has to be stored somewhere. It can be maintained in a database or in a resource file etc. The structure of the resource storage table would be like:

Field Name | Value        | Language
Column1    | Total Amount | en-US
Column1    | Somme totale | fr-FR

Now, while displaying this Column1 on the report, rather than hard-coding its value, we detect the language of interest at runtime and display the corresponding value. We can use the SSRS global variable 'User!Language' to get the language currently set in the client's browser and then pass it to the data table.
This can be done using custom code (a DLL/assembly) containing a function which returns the translated value of each field.

Function: GetValue(fieldname, language)
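A minimal sketch of such a helper, assuming the translations are loaded into an in-memory dictionary (in practice they would come from the resource table or a .resx file); the report would reference the compiled assembly and call it from a textbox expression such as =ReportLocalization.Translator.GetValue("Column1", User!Language):

using System.Collections.Generic;

namespace ReportLocalization
{
    // Illustrative lookup helper for report labels.
    public static class Translator
    {
        private static readonly Dictionary<string, string> Resources =
            new Dictionary<string, string>
            {
                { "Column1|en-US", "Total Amount" },
                { "Column1|fr-FR", "Somme totale" }
            };

        public static string GetValue(string fieldName, string language)
        {
            string value;
            return Resources.TryGetValue(fieldName + "|" + language, out value)
                ? value
                : fieldName;   // fall back to the raw field name when no translation exists
        }
    }
}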

Or we can use a dynamic query and join directly to the resource table in our main dataset for all the required fields, passing in the language returned by User!Language.

Steps:
1. Create a hidden parameter (CurrentLanguage) of type string.

(Screenshot: Lang_Param)

2. Use an expression to assign a non-queried default value to this parameter: =User!Language

3. Use a dynamic query for the main dataset, such as:

   SELECT ColumnName, measure1, measure2
   FROM MainTable MT JOIN ResourceTable RT
     ON MT.FieldName = RT.FieldName   (say FieldName = 'Column1')
    AND RT.Language = @CurrentLanguage

 

Now at runtime @CurrentLanguage will receive the client's language, and the labels etc. will be displayed in the correct language.

  
