LLBLGen Pro, the #1 n-tier generator and O/R mapper tool

I have been using this tool on a couple of my current projects, and it has turned out to be a great tool: it cuts a lot of our development time and noticeably increases productivity.

LLBLGen Pro generates a complete data-access tier and business objects tier for you (in C# or VB.NET), utilizing powerful O/R mapping technology and a flexible task-based code generator, allowing you to at least double your productivity!

LLBLGen Pro comes with a state-of-the-art visual O/R mapping designer to set up and manage your project.

LLBLGen Pro has a task-based code generator framework, which uses task-performing assemblies, called task performers, to execute task definitions. One of the shipped task performers is a template parser/interpreter which produces code using the template set you specify. All customers have access to the SDK, which contains the full source code for all shipped task performers, information on how to produce your own task performers and, for example, how to extend the template language or write your own templates. Another task performer can handle templates written in C# / VB.NET and offers flexible access to every object in the loaded LLBLGen Pro project. Customers also have access to the free Template Studio IDE for creating, editing and testing templates and template sets.

It has four building blocks for your code: Entities, Typed Lists, Typed Views and Stored Procedure calls.

Entities are elements mapped onto tables or views in your catalog(s)/schema set. All, or a subset, of the fields in a table or view are mapped onto fields in an entity. An entity can be a subtype of another entity through inheritance, in which case it derives the supertype's fields, relations and so on. Furthermore, an entity also has fields mapped onto its relations; for example, the entity 'Customer' can have a field 'Orders' mapped onto the relation Customer – Order. Customer.Orders returns all Order entity objects in a collection filtered on that customer, while Order.Customer returns the single customer entity related to that order. Entities and all the relations between them (1:1, 1:n, m:1 and m:n) are determined automatically from a catalog/schema set. You can add your own relations in the designer, for example to relate an entity mapped onto a table to an entity mapped onto a view, or when there are no foreign keys defined in the database schema. You can map as many entities onto a table/view as you want.
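To make those relation fields concrete, here is a small sketch, assuming a SelfServicing project generated from Northwind (the entity names and the key value "ALFKI" are only illustrative, not part of this article's project):

// Illustrative only: navigating the relation fields described above.
CustomerEntity customer = new CustomerEntity("ALFKI");      // fetches the customer with this PK
foreach(OrderEntity order in customer.Orders)               // 1:n relation field Customer -> Order
{
    Console.WriteLine("{0} placed order {1}", customer.CompanyName, order.OrderId);
}
CustomerEntity sameCustomer = customer.Orders[0].Customer;  // m:1 side: Order -> Customer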

Typed Lists are read-only lists based on a subset of fields from one or more entities which have a relation (1:1, 1:n or m:1), for example a typed list CustomerOrder with a subset of the fields from the entities Customer and Order.

Typed Views are read-only view definitions which are 1:1 mapped on views in the catalog/schema set.

Stored Procedure calls are call definitions to existing stored procedures. This way you can embed existing stored procedures in your code, so you don't lose the investments made in current applications that use stored procedures. In SelfServicing, an entity collection is defined per entity type, which offers a rich set of functionality to work directly with one or more entities of the type the collection is related to.

Click here to download the 30-day trial version of the software.

Here are the steps to generate the code.

Step 1 : Create an LLBLGen Pro project

I start by creating the LLBLGen Pro project. The order doesn't matter much, but as we don't have to come back to the LLBLGen Pro designer after this step, it's easiest to do it first in this little project.

After we’ve created the project with a few clicks in the LLBLGen Pro designer, it looks something like this:

[Screenshot: the created project in the LLBLGen Pro designer (step1_createproject_tn.gif)]

 As we don’t need anything fancy, like inheritance, we can leave it at this and simply move on to step 2.
Step 2 : Generate the code

Generating code is also a simple procedure: press F7, select some parameters to meet our goals for the code, and we're set. LLBLGen Pro v2.0 has a redesigned code generation setup mechanism, but I'd like to keep the details under wraps until the release date, to keep a bit of an advantage over the competition. For this article, the details of the code generation process aren't that important anyway.

I've chosen C# as my target language, .NET 2.0 as the target platform and 'adapter' as the paradigm to use for my code. LLBLGen Pro supports two different persistence paradigms: SelfServicing, which has the persistence logic built into the entities (e.g. customer.Save()), and Adapter, which places the persistence logic in a separate class, the adapter. Adapter makes it possible to write code that targets multiple databases at the same time and to work with the entities without seeing any persistence logic. This helps keep logic separated into different tiers, for example, so GUI developers can't take shortcuts to save an entity, bypassing the BL logic.
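To make the difference between the two paradigms concrete, here is a rough sketch against a generated Northwind project (treat it as an approximation, not the literal generated API; the entity name and key value "ALFKI" are illustrative):

// SelfServicing: persistence logic is built into the entity itself.
CustomerEntity customer = new CustomerEntity("ALFKI");     // constructor fetches the row by PK
customer.CompanyName = "New Company Name";
customer.Save();

// Adapter: persistence logic lives in the separate DataAccessAdapter class.
using(DataAccessAdapter adapter = new DataAccessAdapter())
{
    CustomerEntity toUpdate = new CustomerEntity("ALFKI"); // only sets the PK field
    adapter.FetchEntity(toUpdate);
    toUpdate.CompanyName = "New Company Name";
    adapter.SaveEntity(toUpdate);
}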

After the code’s been generated, which took about 4 seconds, it’s time to start Visual Studio.NET 2005 and start some serious application development!

Step 3 : Setting up the VS.NET project

In the folder where I've generated my code, I've created a folder called 'UdaiTest' and created a virtual directory called LLBLGenProTest on the Windows XP IIS installation's default site, pointing to that folder. I load the two generated projects into VS.NET (Adapter uses two projects: one for database-generic purposes, which can be shared among multiple database-specific projects, and one database-specific project) and create a new website using the vanilla 'Your Folder Is The Website' project type as it shipped with VS.NET 2005, using the file system.

Next, I set up the references correctly and add a web.config file to the website project.

The database-specific project generated in the previous step contains an app.config file with our connection string. I copy the appSettings tag with the connection string over to the web.config file of my site, and it's now ready for data access.

Ok, everything is set up without a single line of typing, and we're now ready to create webforms which actually do something. On to step 4!

Step 4: Build a page that uses our LLBLGen Pro layer

As we're working with entities, I think it's a good opportunity to show off the LLBLGenProDataSource controls. In this step I've dragged an LLBLGenProDataSource2 control onto my form and opened its designer to configure it for this particular purpose. The '2' isn't a typo: the LLBLGenProDataSource control is used by SelfServicing, while LLBLGenProDataSource2 is used by Adapter. The '2' suffix has been used for adapter classes since the beginning, so to keep everything consistent with already familiar constructs, the '2' suffix is used here as well.

LLBLGen Pro’s datasource controls are really powerful controls.

using System;
using Northwind.EntityClasses;
using Northwind.HelperClasses;
using SD.LLBLGen.Pro.ORMSupportClasses;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if(!Page.IsPostBack)
        {
            // set initial filter and sorter for datasource control.
            _customerDS.FilterToUse =
                    new RelationPredicateBucket(CustomerFields.Country == "INDIA");
            _customerDS.SorterToUse =
                    new SortExpression(CustomerFields.CompanyName | SortOperator.Ascending);
        }
    }
}
_customerDS is my LLBLGenProDataSource2 control, which is the datasource for my grid. Running this page gives the following results:

Not bad for two lines of code and some mousing in the designer. Now let's move on to something more serious: projections and subsets!
Step 5 : Data Shaping and Projections

In the world of today, we don't yet have nice features like the ones that come with Linq, such as anonymous types. So if we want to create a projection of a resultset onto something else, we either have to define the container class which will hold the projected data up front, or we have to store it in a generic container, like a DataTable. As we're going to databind the resultset directly to a grid, a DataTable will do just fine here. Still, don't feel sad: I'll show you both methods, one using the DataTable and one using the new LLBLGen Pro v2 projection technology, which allows you to project any resultset onto any other construct using generic code.

DataTable approach
But first, the approach using a DataTable. The query has two scalar queries inside the select list: one for the number of orders and one for the last order date. Because of these scalar queries, it's obvious this data can't be stored in an entity, as an entity is mapped onto tables or views, not onto dynamically created resultsets. So in LLBLGen Pro you'll use a dynamic list for this. This is a query built from strongly typed objects, the building blocks of the meta-data used by the O/R mapper core, and its resultset is stored inside a DataTable. This gives the following code, which I've placed in the Page_Load handler:
// create a dynamic list with in-list scalar subqueries
ResultsetFields fields = new ResultsetFields(6);
// define the fields in the select list, one for each slot.
fields.DefineField(CustomerFields.CustomerId, 0);
fields.DefineField(CustomerFields.CompanyName, 1);
fields.DefineField(CustomerFields.City, 2);
fields.DefineField(CustomerFields.Region, 3);
fields.DefineField(new EntityField2("NumberOfOrders",
    new ScalarQueryExpression(OrderFields.OrderId.SetAggregateFunction(AggregateFunction.Count),
    (CustomerFields.CustomerId == OrderFields.CustomerId))), 4);
fields.DefineField(new EntityField2("LastOrderDate",
    new ScalarQueryExpression(OrderFields.OrderDate.SetAggregateFunction(AggregateFunction.Max),
    (CustomerFields.CustomerId == OrderFields.CustomerId))), 5);

DataTable results = new DataTable();
using(DataAccessAdapter adapter = new DataAccessAdapter())
{
    // fetch it, using a filter and a sort expression
    adapter.FetchTypedList(fields, results,
            new RelationPredicateBucket(CustomerFields.Country == "INDIA"), 0,
            new SortExpression(CustomerFields.CompanyName | SortOperator.Ascending), true);
}

// bind it to the grid.
GridView1.DataSource = results;
GridView1.DataBind();

By now you might wonder how I'm able to use compile-time-checked filter and sort expression constructs in vanilla .NET 2.0. LLBLGen Pro uses operator overloading for this, to get a compile-time-checked way to formulate queries without having to write a lot of code. A string-based query language is perhaps an alternative for some, but it isn't compile-time checked: if an entity's name or a field name changes, the compiler won't notice and your code will break at runtime. With this mechanism it won't, as these name changes are spotted by the compiler.
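To illustrate the idea (a conceptual sketch only, not LLBLGen Pro's actual field and predicate classes), overloading the == operator on a field descriptor lets the compiler check the field reference while handing the framework a predicate object instead of a bool:

// Conceptual sketch: how an overloaded == can yield a query fragment instead of a bool.
public class FieldDescriptor
{
    private string _fieldName;

    public FieldDescriptor(string fieldName) { _fieldName = fieldName; }

    public string FieldName { get { return _fieldName; } }

    // The compiler resolves something like CustomerFields.Country == "INDIA" to this overload,
    // so a renamed field becomes a compile-time error instead of a runtime one.
    public static ComparePredicate operator ==(FieldDescriptor field, object value)
    {
        return new ComparePredicate(field.FieldName, "=", value);
    }

    public static ComparePredicate operator !=(FieldDescriptor field, object value)
    {
        return new ComparePredicate(field.FieldName, "<>", value);
    }

    public override bool Equals(object obj) { return ReferenceEquals(this, obj); }
    public override int GetHashCode() { return _fieldName.GetHashCode(); }
}

// The predicate simply carries the comparison until the O/R core translates it to SQL.
public class ComparePredicate
{
    public readonly string FieldName;
    public readonly string Operator;
    public readonly object Value;

    public ComparePredicate(string fieldName, string op, object value)
    {
        FieldName = fieldName;
        Operator = op;
        Value = value;
    }
}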

LLBLGen Pro v2’s projection approach

I promised I'd also show a different approach, namely with projections. For this we first write a simple CustomerData class which will hold the six properties we have to store for each row. It's as simple as this:
public class CustomerData
{
    private string _customerId, _companyName, _city, _region;
    private int _numberOfOrders;
    private DateTime _lastOrderDate;

    public string CustomerId
    {
        get { return _customerId; }
        set { _customerId = value; }
    }

    public string CompanyName
    {
        get { return _companyName; }
        set { _companyName = value; }
    }
   
    public string City
    {
        get { return _city; }
        set { _city = value; }
    }
   
    public string Region
    {
        get { return _region; }
        set { _region = value; }
    }

    public int NumberOfOrders
    {
        get { return _numberOfOrders; }
        set { _numberOfOrders = value; }
    }

    public DateTime LastOrderDate
    {
        get { return _lastOrderDate; }
        set { _lastOrderDate = value; }
    }
}

For our query we simply use the same setup as we used with the DataTable fetch, only now we'll specify a projector and a set of projector definitions. We furthermore tell LLBLGen Pro to fetch the data as a projection, which means that LLBLGen Pro will project the IDataReader directly onto the constructs passed in, using the projector objects. As you can see below, this is generic code and a standard approach which can be used in other contexts as well, for example to project in-memory entity collection data onto different constructs. The projectors are all defined through interfaces, so you can create your own projection engines as well. One nice thing is that users will also be able to project stored procedure resultsets onto whatever construct they might want to use, including entity classes. So fetching data with a stored procedure into a class, for example an entity, will be easy and straightforward as well.

Ok, back to the topic at hand. The code to fetch and bind the resultset using the custom class looks as follows:
// create a dynamic list with in-list scalar subqueries
ResultsetFields fields = new ResultsetFields(6);
// define the fields in the select list, one for each slot.
fields.DefineField(CustomerFields.CustomerId, 0);
fields.DefineField(CustomerFields.CompanyName, 1);
fields.DefineField(CustomerFields.City, 2);
fields.DefineField(CustomerFields.Region, 3);
fields.DefineField(new EntityField2("NumberOfOrders",
    new ScalarQueryExpression(OrderFields.OrderId.SetAggregateFunction(AggregateFunction.Count),
    (CustomerFields.CustomerId == OrderFields.CustomerId))), 4);
fields.DefineField(new EntityField2("LastOrderDate",
    new ScalarQueryExpression(OrderFields.OrderDate.SetAggregateFunction(AggregateFunction.Max),
    (CustomerFields.CustomerId == OrderFields.CustomerId))), 5);

// the container the results will be stored in.
List<CustomerData> results = new List<CustomerData>();

// Define the projection.
DataProjectorToCustomClass<CustomerData> projector =
        new DataProjectorToCustomClass<CustomerData>(results);
List<IDataValueProjector> valueProjectors = new List<IDataValueProjector>();
valueProjectors.Add(new DataValueProjector("CustomerId", 0, typeof(string)));
valueProjectors.Add(new DataValueProjector("CompanyName", 1, typeof(string)));
valueProjectors.Add(new DataValueProjector("City", 2, typeof(string)));
valueProjectors.Add(new DataValueProjector("Region", 3, typeof(string)));
valueProjectors.Add(new DataValueProjector("NumberOfOrders", 4, typeof(int)));
valueProjectors.Add(new DataValueProjector("LastOrderDate", 5, typeof(DateTime)));

using(DataAccessAdapter adapter = new DataAccessAdapter())
{
    // let LLBLGen Pro fetch the data and directly project it into the List of custom classes
    // by using the projections we’ve defined above.
    adapter.FetchProjection(valueProjectors, projector, fields,
            new RelationPredicateBucket(CustomerFields.Country == "USA"), 0,
            new SortExpression(CustomerFields.CompanyName | SortOperator.Ascending),
            true);
}

// bind it to the grid.
GridView1.DataSource = results;
GridView1.DataBind();

It's a bit more code, as you have to define the projections, but at the same time it has the nice aspect of giving you the data in a typed class (CustomerData) instead of a DataTable row.
Enjoy Programming

Will write more on this later…

Alert :: The kind of interview questions you may face from interview panels from here on

Guys,

Please note that since IT industry standards are improving a lot and the number of competitive developers around the world keeps increasing, from now on you can't simply survive by just knowing 'What is the difference between DataSet and DataReader?' or 'What is the purpose of viewstate?'.

Nowadays IT companies expect you to be more dynamic and excellent in logical thinking as well; rewind yourself back to how you prepared to clear your GATE and campus interviews at Infy and CTS. Gradually, companies are recruiting and showing more interest in engineering-background candidates rather than arts college students (like MCA, M.Sc, B.Sc). Look, for example, at some sample questions recently asked by big shots like Microsoft and CTS. It's time to read up on logical thinking and brain teasers; try Shakuntala Devi, Sandy Silverthorne and Howard Black.

Question: What method would you use to check whether a linked list contains a cycle and, if so, to determine at which node the cycle starts?

Answer: There are a number of approaches. The approach I shared runs in O(N) time (where N is the number of nodes in your linked list). Assume that the node definition contains a boolean flag, bVisited.

struct Node
{
  …
  bool bVisited;
};
Then, to determine whether the list has a loop, you could first set this flag to false for all of the nodes (note: if the list might already contain a cycle, this reset loop itself would never terminate, so in practice the flag should start out false when each node is created):

// Detect cycle
// Note: pHead points to the head of the list (assume already exists)
Node *pCurrent = pHead;
while (pCurrent)
{
  pCurrent->bVisited = false;
  pCurrent = pCurrent->pNext;
}
Then, to determine whether or not a cycle exists, loop through each node. After visiting a node, set bVisited to true. When you visit a node, first check whether it has already been visited (i.e., test bVisited == true). If it has, you've hit the start of the cycle!

bool bCycle = false;
pCurrent = pHead;
while (pCurrent && !bCycle)
{
  if (pCurrent->bVisited == true)
    // cycle! pCurrent is the node where the cycle starts
    bCycle = true;
  else
  {
    pCurrent->bVisited = true;
    pCurrent = pCurrent->pNext;
  }
}
A much better approach was submitted by 4Guys visitor George R., a Microsoft interviewer/employee. He recommended using the following technique, which is in time O(N) and space O(1).
Use two pointers.
// error checking and checking for NULL at end of list omitted
p1 = p2 = head;

do {
 p1 = p1->next;
 p2 = p2->next->next;
} while (p1 != p2);
p2 moves through the list twice as fast as p1. If the list is circular (i.e. a cycle exists), p2 will eventually catch up with that sluggard, p1. To find the node where the cycle starts, you can then reset p1 to the head and advance both pointers one node at a time; the node where they meet again is the start of the cycle.
——————————————————————————–

Question: Write code to reverse a doubly linked list.

Answer: This problem isn’t too hard. You just need to start at the head of the list, and iterate to the end. At each node, swap the values of pNext and pPrev. Finally, set pHead to the last node in the list.

Node *pCurrent = pHead, *pTemp;
while (pCurrent)
{
  // swap the next and previous pointers of this node
  pTemp = pCurrent->pNext;
  pCurrent->pNext = pCurrent->pPrev;
  pCurrent->pPrev = pTemp;

  // the last node processed becomes the new head
  pHead = pCurrent;

  pCurrent = pTemp;
}
——————————————————————————–

Question: Consider you have an array that contains a number of strings (perhaps char * a[100]). Each string is a word from the dictionary. Your task, described in high-level terms, is to devise a way to determine and display all of the anagrams within the array (two words are anagrams if they contain the same characters; for example, tales and slate are anagrams.)

Answer: Begin by sorting the characters of each element in the array into alphabetical order. So, if one element of your array was slate, it would be rearranged to form aelst (use some mechanism to remember that this particular instance of aelst maps to slate). At this point, slate and tales would be identical: aelst.

Next, sort the entire array of these modified dictionary words. Now, all of the anagrams are grouped together. Finally, step through the array and display duplicate terms, mapping the sorted letters (aelst) back to the word (slate or tales).
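For completeness, here is a minimal sketch of that idea, written in C# to stay in line with the rest of this post (the original question assumed a char* array in C/C++). It groups words by their sorted letters in a dictionary, which yields the same grouping as sorting the whole modified array:

using System;
using System.Collections.Generic;

class AnagramFinder
{
    // Prints every group of words that share the same letters.
    static void PrintAnagrams(string[] words)
    {
        Dictionary<string, List<string>> groups = new Dictionary<string, List<string>>();
        foreach(string word in words)
        {
            char[] letters = word.ToCharArray();
            Array.Sort(letters);
            string key = new string(letters);       // "slate" and "tales" both become "aelst"
            if(!groups.ContainsKey(key))
            {
                groups.Add(key, new List<string>());
            }
            groups[key].Add(word);
        }

        foreach(KeyValuePair<string, List<string>> group in groups)
        {
            if(group.Value.Count > 1)               // more than one word -> anagrams
            {
                Console.WriteLine(String.Join(", ", group.Value.ToArray()));
            }
        }
    }

    static void Main()
    {
        PrintAnagrams(new string[] { "tales", "slate", "stale", "hello", "world" });
    }
}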

——————————————————————————–

Question: Based on the following prototype:

int compact(int * p, int size); 
try to write a function that will take a sorted array, possibly with duplicates, and compact the array, returning the new length of the array. That is, if p points to an array containing 1, 3, 7, 7, 8, 9, 9, 9, 10, then when the function returns, the contents of p should be 1, 3, 7, 8, 9, 10, with a length of 6 returned.

Answer: A single loop will accomplish this.

int compact(int * p, int size)
{
  int current, insert = 1;

  if (size == 0)
    return 0;

  // copy each new value over the last kept value's successor slot
  for (current = 1; current < size; current++)
  {
    if (p[current] != p[insert-1])
    {
      p[insert] = p[current];
      insert++;
    }
  }
  return insert;   // number of unique elements now at the front of p
}

My New Blog Exclusively for .NET 3.0 and 3.5 Lovers

Guys,

After receiving lots of comments and requests from blog and group members and visitors to write exclusively on .NET 3.0 and VS.NET 2008, an excellent technology which is going to rule the future of the software industry, I have decided to create a new blog with articles explicitly on .NET 3.0, 3.5 and VS.NET 2008. Please keep giving your support and stay active on the new blog as well.

I can give you 100% assurance that this blog will be an excellent resource for anyone who wants to upgrade their skills to the .NET 3.0 and 3.5 technologies.

Click here (http://dotnet3atthebest.blogspot.com/) to enter the exciting world. Go ahead and add it to your favourites.

WSS 3.0 In-Place Upgrade – Content Databases

I installed WSS 3.0 using the in-place upgrade last night and did some detailed R&D on the way WSS manages the content databases.

Managing Content Databases

Basically Microsoft Windows SharePoint Services uses a database to store and manage site content. Just as each virtual server can host multiple top-level Web sites, each virtual server can rely on multiple content databases to store site content. If you are running Windows SharePoint Services on a single server, hosting just a few sites, you can probably use the same content database for all of your sites. If you want to add capacity in a server farm, you will most likely need several content databases to store site data for each virtual server.

To make it easier to manage site content for large server farms, you can also set a limit on how many top-level Web sites can store content in a content database. You can specify a warning limit and a maximum limit for the number of sites. When a warning limit or maximum limit is reached, an event is logged in the server’s NT Event Log, so you can take action. When a maximum limit is reached, no more sites can be created using that content database.

When you create a new site, the databases are queried and the new site's content is added to the database which has the most available space. For example, suppose your virtual server has three content databases, all set to warn you when they reach 2000 sites, with a maximum of 2025 sites. When the first content database reaches 2000 sites, an event is logged. When it reaches 2025 sites, no more sites can be created in that database. When you are close to the limit on two out of three of the content databases, and you know that you'll need to host more than 2000 additional sites, it is time to create another content database.

You can specify any number of sites for the warning and maximum limits. To determine an appropriate number for your situation, divide the amount of available disk space on the database server by the estimated size for each site (plus a buffer). If you are using quotas, divide the disk space by the disk space quota (plus a buffer). For example, with roughly 200 GB of available database disk space and around 100 MB estimated per site, the maximum would sit somewhere below 2,000 sites, with the warning level set a comfortable margin under that.

A buffer allows the number of sites to grow beyond the warning level, but not exceed your disk space. The size of the buffer is up to you, but make sure to provide enough space for growth, so that you don’t exceed the maximum number before you can react to a warning event. When the maximum number is reached, no more sites can be created in that content database. Be sure to create a buffer large enough so that your users can continue to create sites as required, without having to constantly create new content databases.

Content databases are created and managed at the virtual server level. When you create a new content database (or when you extend a virtual server), you specify the database connection settings for the content database. You can update these settings if, for example, the database server name changes.

You can create or delete content databases, and specify settings such as which database server to use for the content and how many top-level Web sites to allow per content database in a server farm, by using pages in HTML Administration. In HTML Administration, you can view the full list of content databases for your virtual server, and see the current, warning and maximum number of sites for each content database at a glance.
 

.NET Framework 3.5 in CTP

Brief on .NET Framework 3.5

Many ISVs, enterprises and even Microsoft product teams are successfully building on the new features (WF, WCF, WPF and CardSpace) in the .NET Framework 3.0. Microsoft plans to continue to invest in the .NET Framework developer platform, and in support of existing users the .NET Framework 3.5 has no serious breaking changes, so existing applications built for .NET Framework 2.0 or 3.0 will continue to execute. The .NET Framework 3.5 adds new features in several major technology areas.

  1. Integration of Language Integrated Query (LINQ) and data awareness (see the short sketch after this list)
  2. Support for Web 2.0 AJAX style applications and services in ASP.NET and WCF
  3. Full tooling support for WF, WCF and WPF including the new workflow-enabled services technology
  4. New classes in the base class library (BCL) for the .NET Framework 3.5 address the most common customer requests
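To give a flavour of point 1, here is a minimal LINQ to Objects sketch of the query style the new C# compiler enables (the sample data is made up for illustration):

using System;
using System.Linq;

class LinqSample
{
    static void Main()
    {
        string[] customers = { "Alfreds Futterkiste", "Around the Horn", "Bottom-Dollar Markets" };

        // Compile-time checked query over an in-memory collection.
        var startsWithA = from name in customers
                          where name.StartsWith("A")
                          orderby name
                          select name;

        foreach(string name in startsWithA)
        {
            Console.WriteLine(name);
        }
    }
}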

.NET Framework 3.5 ships with Visual Studio codename "Orcas" and will be available for separate download from MSDN.

Just The Server Side (WF and WCF)

Here’s some detail of the new things to look for from WF and WCF.

Workflow enabled services – process and messaging together

Web 2.0 AJAX friendly (works with ASP.NET AJAX Client) and REST enabled WCF services

New project templates and other new features in Visual Studio for WF and WCF

More WS-* Standards Support including WS-AtomicTransaction, WS-ReliableMessaging, WS-SecureConversation and WS-Coordination

RSS and ATOM Syndication Support in WCF

Partial Trust for WCF applications deployed through ClickOnce

Rules Data Improvements

Got my certification! Get recognized by Microsoft

I wrote the exam for MCTS 70-541: WSS 3.0, Application Development on November 2nd, and passed with a score of 966, followed by an email from MS congratulating me on passing my exam!

So there we go, I am officially certified for MOSS 2007, although I must say that I would have preferred a simulation exam rather than multiple choice, which anyone with a parrot memory could pass. In fairness, you do need some experience with the product in order to pass.

I have opened a new page on my blog to help you learn more about the various Microsoft certifications; read it there and earn your certifications: https://bestofcyber.wordpress.com/microsoft-certifications/

Steps to Install SharePoint 2007

I did a MOSS 2007 install last week at Ashish's office, and I just want to do a recap post on it for all of you, as most people start mumbling a little when it comes to MOSS 2007.

I’ve got two servers running Win 2K3 R2 in my domain.

To install IIS, you need the i386 folder off of the Server 2K3 installation disc.

I install MOSS on one server; on the other, I run the Office SharePoint Server install and, of course, I get the error:

Setup is unable to proceed due to the following error(s):
-This product requires .Net Framework 2.0 to install.
-This product requires at least Windows Workflow Foundation version 3.0.4203.2, which is part of the .Net Framework 3.0
-This product requires ASP.NET v2.0 to be set to ‘Allow’ in the list of Internet Information Services (IIS) Web Server Extensions.  If it is not available in the list, re-install ASP.NET v2.0.
Correct the issue(s) listed above and re-run setup.

Now, I have to apologize to you people because I know I had covered each of these issues in detail, but we're still trying to recover some of my old posts and, since you know robots have less than perfect memory, this is about all I can recall:

The .Net 2.0 install requirement can be stupid in that .NET 2.0 may already be installed, but you still have to uninstall it and reinstall it.

The WWF install is essentially a .Net 3.0 install.

Once you take care of those issues, the “Allow” configuration is probably resolved as well.

So, let’s see what we have to do…

I'm a little fuzzy on the difference between the .Net Framework 2.0 and ASP.Net v2.0, but I go here:

http://www.microsoft.com/downloads/details.aspx?FamilyID=0856EACB-4362-4B0D-8EDD-AAB15C5E04F5&displaylang=en

And download dotnetfx.exe and run it.

“Set up is continuing the install.”  That’s great.

That’s almost as good as the dialog box in the SQL Server 2005 install that says “Setup is proceeding to continue with the install.”

The WWF install is really a .Net 3.0 install which you can get from here:

http://www.microsoft.com/downloads/details.aspx?FamilyId=10CC340B-F857-4A14-83F5-25634C3BF043&displaylang=en

So when those installs are completed, I run setup again, enter my product key and click next.

I tell it I want a complete install as opposed to a web-front end or stand-alone install.  It shows me an installation progress bar and it moves slowly to the left.

Then we get the success dialog box with a check mark to run the Configuration Wizard now.

The wizard fires up…

I dot the “No, I want to create a new farm.” option.

I specified my database server and my service account for connecting to the database server.  I went to my database server and created the login for my service account and gave it dbcreator and securityadmin roles.

I specified my Central Admin port number to my favorite: 63696 (use your number keypad) and NTLM authentication.

Reviewed the settings and clicked Next.

Now, it’s counting nine tasks with the little green floater moving repeatedly from left to right.

1, 2…, 3, 4, 5, 6…, 7.., 8, 9

Configuration Successful! and click Finish.

The Central Administration page loads.

So we’ve got these administrative tasks in a list.

Task One: Read the tasks – kind of like that old elementary school trick where they tell you to read all the instructions before you start.  It says to delete the item from the list when you’re done but I’ll just mark it completed.

Task Two: Add Servers to Farm – This is done.

Task Three: Assign Services to Servers – This one has an action link.  I click on the link and it takes me to the page in Operations.  Now I’m on a medium farm so that option is dotted and a set of services appear in the table below like a connectable web part. To see all the services, you have to dot the Custom option or select the All view.

You have to start the Document Conversions Load Balance Service first, because you need a load balancer server to run the launcher service.  The Load Balancer service starts and then you can start the Document Conversion service where you’ll select your Load Balancer server.  It chooses port 8082 for whatever reason.

I start the Excel Calculator.

I start MOSS Search, checking the options to 'Use this server for indexing content' and 'Use this server for serving search queries'. I drop my own e-mail address in the e-mail address field for some reason, use my service account for crawling, and set performance to Maximum.

Finally, I start WSS Search, adding my service account and password as the service and content access accounts, and leave the rest unchanged. All my services are now started on this server. I edit the task item and mark it completed.

Task Four: Configure Server Farm's Shared Services – This one has an action link to Application Management. This page has been trouble in the past, so I'm a little trepidatious. The first issue is the Shared Services Provider, and it needs a web application, so I click Create a new web application.

Now this page wants to create SharePoint – 80 on port 80, create a matching folder in the Virtual Servers folder and a new application pool, and restart IIS. I've been right here before and didn't want to run Shared Services through port 80, but I click OK all the same. Now, I think this will just provision an IIS web site; I don't have to use it for Shared Services.

So, this is taking forever.

Finally, it resets and takes me back to the New Shared Services Provider page, where my SharePoint – 80 web application is selected. I'm going to create another web application.

Again, this is taking forever.

Top 5 reasons to develop applications using VS.NET 2008

Earlier this year Visual Studio celebrated its tenth anniversary. As Microsoft moves towards releasing Visual Studio 2008, let's take a moment to reflect on the product's evolution.

The first release of Visual Studio in 1997 featured separate IDEs (that required their own installation) for Visual C++, Visual Basic, J++, and a tool known as InterDev. Visual Studio 6.0 was a dramatic improvement that marked the birth of Visual Basic 6 and embodied the idea of a set of unified services across all languages.

With Visual Studio .NET 2002 and Visual Studio .NET 2003, this vision was realized with the .NET Framework. For the first time an individual developer could write an application in the language of their choosing while taking advantage of a common set of tools including designers, drag and drop controls, and IntelliSense. Along with the increase of individual developer productivity was an increase in the size and complexity of development projects and teams.

Visual Studio 2005 was born to help developers in teams of any size increase collaboration and reduce development complexity. With each progressive release, Microsoft has reaffirmed its commitment to empowering the developer by creating a dialogue with the community to help incorporate feedback and improve the product. Visual Studio 2008 is no exception. With your help Microsoft is prepared to deliver on the commitment to make every software project successful on the Microsoft platform.

 Build Next-Generation Applications

Visual Studio 2008 is a unified toolset that enables developers and development teams to build great applications on the Microsoft platform. Support for Windows Vista and the Microsoft Office system development assist developers in building compelling rich client applications. Now, with included support for ASP.NET AJAX and the Silverlight Add-in for Visual Studio 2008, developers can also build a spectrum of rich interactive applications for the Web. As the Microsoft platform further increases in capabilities through the delivery of Windows Server 2008 and SQL Server 2008, Visual Studio 2008 will continue to be the single environment that developers and development teams need to be successful.

Harness the power of Microsoft Office (Visual Studio Tools for Office)

With Visual Studio 2008 developers can easily target the more than 500 million users of Microsoft Office while using the same managed code skills that they’ve developed for writing Microsoft Windows applications or ASP.NET applications. As an integrated component of Visual Studio 2008 Professional Edition, Visual Studio Tools for Office (VSTO) enables developers to customize Word, Excel, PowerPoint, Outlook, Visio, InfoPath, and Project to improve end user productivity. Whether building Office UI-based workflow solutions, custom add-ins, or Microsoft Office SharePoint Server solutions, Visual Studio provides the tools to give the developer a RAD development experience.

Stand out with Windows Vista (WPF,WWF,CF)

Visual Studio is the ideal environment for building applications that have the Windows Vista look and feel. Development teams of any size building applications targeting the next generation user experience will be able to create, edit, debug, and deploy Windows Presentation Foundation applications in Visual Studio 2008. Visual Studio enables a developer building a WPF application to edit XAML directly (with IntelliSense support) or create the user interface through the new visual designers. A change made to the layout of an application through one of these tools is reflected immediately in the other. Additionally, Visual Studio provides support for taking advantage of more than 8,000 new native APIs in Windows Vista.

Developers building distributed applications will find that creating and consuming web services with Windows Vista technology is a great experience. Visual Studio makes it easy for you to implement a Web service using Windows Workflow Foundation. You can test this service without writing a single line of code and consume or expose this service from an existing workflow.

Build interactive Web experiences (Scott Web Experience)

The Microsoft Web platform is an end-to-end offering for designing, developing, and hosting applications on the Web. Visual Studio 2008 provides tools that span the entire platform from the secure, reliable, and extensible infrastructure of IIS, through the amazing client-side experience of Silverlight, and everything in between. Developers will be able to take advantage of rich client-side and server-side frameworks to easily build client-centric Web applications. These applications can integrate with any backend data provider, run within any modern browser and have complete access to ASP.NET application services and the Microsoft platform.

Visual Studio.NET 2008 Shell  

If you create software development tools, you’ll want to consider building on the Visual Studio 2008 Shell. A streamlined Visual Studio development environment, the Visual Studio Shell provides the core foundation so you can focus on building your application’s unique features. Flexible customization options help you deliver optimized experiences for specific markets.

Key Benefits

Faster Development. The Visual Studio Shell accelerates development by providing a base integrated development environment that can host custom tools and programming languages.

A Familiar Environment. Developers can build on the Visual Studio platform and provide end users a familiar user interface, speeding the learning curve for both.

Optimized for Languages & Tools. Created in response to requests from our partners, the Visual Studio Shell gives you the option of integrating your tools with Visual Studio or creating an isolated, custom-branded application.
  

Visual Studio Shell (integrated mode)

Optimized for Programming Languages

Applications built on the integrated Shell will automatically merge with any other editions of Visual Studio installed on the same machine.

This is the Visual Studio Shell (integrated mode) running Iron Python.

[Screenshot: vsnet2008small.jpg]

Visual Studio Shell (isolated mode)
Optimized for Specialized Tools

Applications built with the isolated Shell will run side-by-side with any other editions of Visual Studio installed on the same machine.

This is built on the Visual Studio Shell (isolated mode).

[Screenshot: vsnet-storyboarddesigner_isolated_large.jpg]

Microsoft Announces DAISY for Word

Microsoft has made another affirmation of its commitment to providing the visually impaired with useful software, announcing yesterday morning that it is developing a plug-in for Microsoft Word that translates documents into DAISY XML, a standard for digital talking books.

DAISY works by creating a digital audio file that narrates the document's content and maps to the text. Refreshable Braille displays, made up of digitally activated pins, can also be made to correspond with DAISY files, so the reader can know how words are spelled or scan content more quickly.