
Silverlight 4 and IClientMessageInspector don't give access to the Set-Cookie HTTP header


I added a new message inspector to my WCF client runtime (in Silverlight 4) to intercept all the replies from the server and gain access to the HTTP headers.


    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        var httpResponse = (HttpResponseMessageProperty)reply.Properties[HttpResponseMessageProperty.Name];
        var httpHeaders = httpResponse.Headers.AllKeys;
    }

It works fine, but I don't see any Set-Cookie header in the reply, even though it shows up in Fiddler.

The httpHeaders variable above contains headers such as Cache-Control, Content-Length, Content-Type, X-Powered-By, etc.

Do you know why the Set-Cookie HTTP header is not returned?

Thanks in advance



2 Answers Found


Answer 1

There are some headers in Silverlight which are not exposed in the WebHeaderCollection, so I'm assuming Set-Cookie is one of them. To be able to see the cookies sent by the server, I think you'll need two things:

1. Use the client HTTP stack (and not the browser stack, which is the default). If you use the browser networking stack, it will take care of the cookies for you, and you won't have access to them. To set the client stack, see http://msdn.microsoft.com/en-us/library/dd920295(VS.95).aspx.

2. Use the CookieContainer (which can be set on the client proxy generated by slsvcutil / Add Service Reference) to retrieve the cookies set by the service. The CookieContainer is how you deal with cookies when using the client HTTP stack.
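A minimal sketch of both steps on the plain HttpWebRequest API (the URL is a placeholder; with a generated WCF proxy the container is wired up on the proxy instead):

    // 1. Opt in to the client HTTP stack before any request is created.
    WebRequest.RegisterPrefix("http://", System.Net.Browser.WebRequestCreator.ClientHttp);

    // 2. Attach a CookieContainer so the stack collects Set-Cookie values for you.
    var cookies = new CookieContainer();
    var request = (HttpWebRequest)WebRequest.Create(new Uri("http://example.com/service"));
    request.CookieContainer = cookies;
    // After the response completes, cookies.GetCookies(request.RequestUri)
    // holds the cookies the server set.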


Answer 2

Hi Carlos,

I'm now using the CookieContainer and it does the job. Thanks.





We use a cookie for authentication, and in all other OData requests to the server the cookie values are passed (as observed in Fiddler). However, when I call SetSaveStream on the DataServiceContext, then UpdateObject and finally BeginSaveChanges, the initial POST request that posts the media stream to the server does not include the Cookie header, and of course the server responds with a 401.


One interesting thing I noticed is that the Referer header on other requests is set to the browser URL, while for the stream POST it is set to the absolute URL of my XAP file. This seems strange to me and suggests that this POST request is perhaps being handled differently.



The requests for regular entity queries (which do pass the authentication cookie) have exactly the same scheme, host, domain and path as the media resource POST, in which case I believe it should include the same cookie values that are in the browser session.

Any ideas why the media resource stream POST would not include the cookies?
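For reference, the call sequence described above can be sketched as follows (entity, stream and callback names are hypothetical):

    // Queue the media stream for the entity, mark the entity dirty, then save.
    // The media POST described above is the first request BeginSaveChanges issues.
    context.SetSaveStream(photo, photoStream, true, "image/jpeg", slug);
    context.UpdateObject(photo);
    context.BeginSaveChanges(OnSaveChangesCompleted, context);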


I have been having difficulties with an authentication scenario in a web application I've been trying to build an interface for in Silverlight 3 (and Silverlight 4). I use the client HTTP stack to do a GET to a URL that is a protected resource on the server. The server responds with the following:

HTTP/1.1 302 Moved Temporarily
Server: Apache-Coyote/1.1
Cache-Control: no-cache
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Set-Cookie: JSESSIONID=8F9C660395C70E62EB3B7C65EDAC1BFB; Path=/
Location: http://localhost.:4502/server/login.jsp;jsessionid=8F9C660395C70E62EB3B7C65EDAC1BFB
Content-Length: 0
Date: Fri, 18 Jun 2010 21:20:56 GMT  

The ClientHttp stack redirects and performs the following GET:

GET http://localhost.:4502/server/login.jsp;jsessionid=8F9C660395C70E62EB3B7C65EDAC1BFB HTTP/1.1
Accept: application/xml
Accept-Encoding: identity
User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E)
Host: localhost.:4502
Connection: Keep-Alive 

If the CookieContainer is interrogated, the JSESSIONID cookie is not present. I have to resort to parsing the Location header and setting the cookie myself.
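The manual workaround mentioned above could be sketched like this (a sketch under the assumption that `response` and `cookieContainer` are the completed client-stack response and the container in use; variable names are hypothetical):

    // Recover the session id from the URL the server embedded it in,
    // and seed the container manually.
    Uri final = response.ResponseUri; // e.g. .../login.jsp;jsessionid=8F9C...
    const string marker = ";jsessionid=";
    int idx = final.AbsoluteUri.IndexOf(marker, StringComparison.OrdinalIgnoreCase);
    if (idx >= 0)
    {
        string sessionId = final.AbsoluteUri.Substring(idx + marker.Length);
        cookieContainer.Add(new Uri(final.GetLeftPart(UriPartial.Authority)),
                            new Cookie("JSESSIONID", sessionId, "/"));
    }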

Is this a bug, or am I missing something?

Thanks in advance.




I have implemented my own IClientMessageInspector, which I inject as an endpoint behavior when consuming the service operation. Everything works as I expect apart from one thing: in my implementation of AfterReceiveReply, reply.Headers.Action always returns null, whereas in BeforeSendRequest the Action has the value I would expect. Can anyone help?

I have included my very simple implementation of IClientMessageInspector and the definition of the service operation being invoked.

public class ClientBehaviorInstance : IClientMessageInspector
{
    #region IClientMessageInspector Members

    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        string action = request.Headers.Action; // returns the value I expect
        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        string action = reply.Headers.Action; // always null
    }

    #endregion
}

    [OperationContract(Action = "Expense/Create", ReplyAction = "Expense/Create")]
    ExpenseCreateResponseMessage CreateExpense(ExpenseCreateRequestMessage request);


Note: I have tried it with and without the ReplyAction parameter, but no joy.



Hi experts,

I am getting the error below when I try to restore a VPC from its saved state. I saved the state and committed the undo disks; after completing that operation, when I try to restore the VPC from the saved state I get the following error:

"The specified CGI application misbehaved by not returning a complete set of HTTP headers."

Please help me





I'm using a class MyClass that inherits from SoapHttpClientProtocol (auto-generated in my project by creating a Web Reference from a .wsdl file representing a service).

Before calling a "WebMethod" of this service, I need to customize the HTTP header of my request. I tried overriding the GetWebRequest() method of SoapHttpClientProtocol this way:

public partial class MyClass : System.Web.Services.Protocols.SoapHttpClientProtocol
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(uri);
        request.Headers.Add("MyCustomHeader", "MyCustomHeaderValue");
        return request;
    }
}


I was hoping that GetWebRequest was called in the constructor of MyClass; apparently it's not.


Could someone help me?
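For what it's worth, SoapHttpClientProtocol calls GetWebRequest once per web method invocation, not in the constructor, so the override above should fire when a method is actually called; a usage sketch (the method name is hypothetical):

    var client = new MyClass();
    // No request exists yet; GetWebRequest (and the custom header) runs here:
    var result = client.SomeWebMethod();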

Can someone please help me with this issue? I am writing exceptions to the event log, and my event log is filled with this exception message:

System.Web.HttpException: Server cannot append header after HTTP headers have been sent.
   at System.Web.HttpResponse.AppendHeader(String name, String value)
   at System.Web.HttpResponse.AddHeader(String name, String value)
   at AddFormPopUp.streamingFiles(String DocID, String FilePath, String FileExtension1)

Here is the code in the streamingFiles function:
            Dim binReader As New BinaryReader(File.Open(strFileNamePath, FileMode.Open, FileAccess.Read))
            binReader.BaseStream.Position = 0

            Dim binFile As Byte()
            binFile = binReader.ReadBytes(Convert.ToInt32(binReader.BaseStream.Length))
            binReader.Close()

            ' Headers must be added before any part of the response body is sent
            Response.AddHeader("content-disposition", "attachment; filename=" & Strfilenameonly)
            Response.OutputStream.Write(binFile, 0, binFile.Length)

            If Response.IsClientConnected Then Response.Flush()
            binFile = Nothing


Hi all

I've successfully configured a send port using the WCF-Custom adapter. It's using basicHttpBinding with transport security (HTTPS).

However, the Java web service requires a basic auth header (instead of a WS-Security header). How can I accomplish this?

I've already tried to:
- Set credentials on the adapter -> no impact
- Add the header using msg(HTTP.UserHttpHeaders) -> header still not present
- Set the msg(WCF. ) property -> results in a SOAP header

Any suggestions?



If someone on the Internet Explorer dev team monitors this forum, it would be great to get some insight into this weird behavior in IE 6, 7 and 8.

We are able to reliably recreate the following scenario:

1. Create a small HTML page that makes AJAX requests to a server (using HTTP POST)
2. Disconnect from the network and reconnect
3. Monitor the packets that IE generates after the failure

After a failed network connection, IE makes the next AJAX request but only sends the HTTP header (not the body) when doing the HTTP POST. This causes all sorts of problems on the server, as it is only a partial request. Google this issue with Bing and you'll find "random server errors" etc.

We know that IE (unlike most other browsers) always sends an HTTP POST as TWO TCP/IP packets; the header and body are sent separately. In the case directly after a failure, IE only sends the header.

There is a similar problem, triggered by HTTP keep-alive timeouts shorter than 1 minute, documented here:


Here are the before and after failure packet captures:

Notice how the HTTP header and payload are both sent.

After a failure, notice how only the header is sent. IE *never* sends the payload, and the server eventually responds with a timeout.







Please find my cookie parser code below; use it if you like it. You are most welcome to suggest improvements, because I know there is a lot of scope for improvement. The code may be used in different ways; a few suggestions:

1) You can keep one cookie collection object and, before making any HTTP call, use this collection to attach cookies to the call based on domain.

2) You may use a Hashtable where each domain acts as the key and the value holds the cookie collection object for that particular domain.

3) You may use it as is if you are not making calls to domains other than "msn" and "live".
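Suggestion 2 above could be sketched like this (a hypothetical helper, using a generic dictionary in place of a Hashtable):

    private static readonly Dictionary<string, CookieCollection> cookiesByDomain =
        new Dictionary<string, CookieCollection>(StringComparer.OrdinalIgnoreCase);

    private static void StoreCookie(Cookie cookie)
    {
        CookieCollection bucket;
        if (!cookiesByDomain.TryGetValue(cookie.Domain, out bucket))
        {
            bucket = new CookieCollection();
            cookiesByDomain[cookie.Domain] = bucket;
        }
        bucket.Add(cookie);
    }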

private static void FindDomainCookies(WebHeaderCollection headers)
{
    for (int i = 0; i < headers.Count; i++)
    {
        if ("Set-Cookie" == headers.Keys[i])
        {
            string rawCookie = headers[i];

            if (rawCookie.Contains(","))
            {
                // Regex for the date format per RFC 2109 (http://www.w3.org/Protocols/rfc2109/rfc2109):
                // Wdy, DD-Mon-YY HH:MM:SS GMT
                string dateRegExp = @"(?<day>expires=[A-Z,a-z]{3}),(?<date>\s\d{2}-[A-Z,a-z]{3}-\d{4}\s\d{2}:\d{2}:\d{2}\sgmt)";
                string replaceDateExp = @"${day}${date}";

                // Strip the comma inside the expires date so it does not break the split below
                rawCookie = Regex.Replace(rawCookie, dateRegExp, replaceDateExp, RegexOptions.IgnoreCase);
            }

            string[] multipleCookies = rawCookie.Split(new char[] { ',' });

            for (int j = 0; j < multipleCookies.Length; j++)
            {
                Cookie cookie = new Cookie();
                string[] cookieValues = multipleCookies[j].Split(new char[] { ';' });
                string[] paramNameValue;

                foreach (string param in cookieValues)
                {
                    paramNameValue = param.Trim().Split(new char[] { '=' });
                    paramNameValue[0] = paramNameValue[0].ToLower();

                    if (paramNameValue[0] == "domain")
                    {
                        cookie.Domain = param.Split(new char[] { '=' })[1];
                    }
                    else if (paramNameValue[0] == "expires")
                    {
                        string date = paramNameValue[1];

                        // Restore the comma after the day name (removed above) before parsing;
                        // date format per RFC 2109: Wdy, DD-Mon-YY HH:MM:SS GMT
                        date = Regex.Replace(date, @"(?<day>(sun|mon|tue|wed|thu|fri|sat))", @"${day},", RegexOptions.IgnoreCase);

                        cookie.Expires = Convert.ToDateTime(date);
                    }
                    else if (paramNameValue[0] == "path")
                    {
                        cookie.Path = paramNameValue[1];
                    }
                }

                cookieValues[0] = cookieValues[0].Trim();
                cookie.Name = cookieValues[0].Split(new char[] { '=' })[0];
                cookie.Value = cookieValues[0].Split(new char[] { '=' })[1];

                if (cookie.Domain.ToLower().Contains("live"))
                {
                    // add to the "live" cookie collection (elided in the original)
                }
                else if (cookie.Domain.ToLower().Contains("msn"))
                {
                    // add to the "msn" cookie collection (elided in the original)
                }
            }
        }
    }
}





Hello everyone,

What I have done is create an HttpWebRequest for a URL. In the response headers I get a "Set-Cookie" header, which contains 2 cookies. But it should actually contain 3, as I can see through Fiddler. Am I missing something?

The application is out-of-browser with elevated permissions on.
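One thing worth trying is letting a CookieContainer do the parsing instead of reading the raw Set-Cookie value, since several cookies folded into one header are easy to mis-split; a sketch, assuming the client HTTP stack and a placeholder URI:

    var uri = new Uri("http://example.com/");
    var cookies = new CookieContainer();
    var request = (HttpWebRequest)WebRequest.Create(uri);
    request.CookieContainer = cookies;
    request.BeginGetResponse(ar =>
    {
        var response = (HttpWebResponse)request.EndGetResponse(ar);
        // The container now holds the parsed cookies, however many the server sent.
        foreach (Cookie c in cookies.GetCookies(uri))
            System.Diagnostics.Debug.WriteLine(c.Name + "=" + c.Value);
        response.Close();
    }, null);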


I am using VSTS 2010 on Windows 7 64-bit.

After I record test steps with the web test recorder and replay them, I get the following error:

This system requires the use of HTTP cookies to verify authorization information. Our system has detected that your browser has disabled HTTP cookies, or does not support them. Please refer to the Help page for more information on how to correctly configure your browser for use with this system.

I changed the IE privacy setting to "always allow session cookies" but I still get the same error.

It seems that the IE setting does not apply to the browser VSTS 2010 uses when replaying the test.

So, any solution?



Hi all

I have a web application created in SP2010 on port 80; a host header of "MyServerName" was used while creating the web application through Central Administration. The host name is reflected in IIS and a DNS entry is also made in the hosts file.

I created a site collection and it works fine. I have a visual web part hosted in the site collection. The web part just checks a cookie value using HttpContext.Current.Request.Cookies[MyCookie]; if it is not found it creates the cookie, and on subsequent requests it should get the value from the cookie. But MyCookie is never found in the cookie collection. We are using Windows authentication for the web application.

My web part works fine when the web application does not have a host header.

Can anyone help me understand why the cookie is not found when there is a host header for the web application? Below is the sample code I am using. I tried the cookie as secure and without secure too.

Help would be highly appreciated.

        protected void Page_Load(object sender, EventArgs e)
        {
            string usrName = HttpContext.Current.User.Identity.Name.ToString();

            if (HttpContext.Current.Request.Cookies[usrName.Remove(0, usrName.IndexOf("|") + 1)] != null)
            {
                lblMessage.Text = HttpContext.Current.Request.Cookies[usrName.Remove(0, usrName.IndexOf("|") + 1)].Value;
            }
            else
            {
                WriteCoookie(usrName);
            }
        }

        private void WriteCoookie(string usrName)
        {
            HttpCookie disclaimerCookie = new HttpCookie(usrName.Remove(0, usrName.IndexOf("|") + 1));

            disclaimerCookie.Value = "Accepted Terms and Condition";
            disclaimerCookie.Secure = true;
            disclaimerCookie.Expires = DateTime.MaxValue;
            disclaimerCookie.Domain = HttpContext.Current.Request.Url.Host;
            //disclaimerCookie.Path = "/";
            // disclaimerCookie.Secure = true;

            HttpContext.Current.Response.Cookies.Add(disclaimerCookie);
            //HttpContext.Current.Response.Cookies[this.UserName].Secure = true;
        }





I have a WSS Web Application that has been up and running without issue for several months.  The only problem that I've had is the inability to use SharePoint Designer since my site uses Forms Authentication.

Yesterday I found a Microsoft article (Configuring Multiple Authentication Providers for SharePoint 2007: http://blogs.msdn.com/sharepoint/archive/2006/08/16/702010.aspx) explaining how to set up multiple authentication providers by extending a website for different zones.  I walked through the steps as best I could, extending my current web application, using the same URL with a new port assignment and selecting the Extranet zone.  My intention was to have this zone set up with Windows authentication and use this URL/port combination with SharePoint Designer.

The configuration failed with the error message "File Not Found".  However, when I returned to the web application list, the extended site did appear in the list.  There was no corresponding site listed in IIS and my attempts to access the site using the specified url/port failed.  As it turned out, I was also unable to access my original site at this point.

So, my next step was to remove the extended web application.  I went into Application Management, Remove..., selected my web application and the zone that I added.  I made sure to NOT select the radio button to remove the site from IIS.

This worked fine but I still cannot access my site.  When I try to navigate to the root url, I receive an HTTP 403 Forbidden error.  If I add "/default.aspx" to the url, I get an HTTP 404 Not Found Error.

Looking at IIS, the website is still setup correctly, with all the same settings as before - and exactly the same as another WSS site I have running on the same box which I can still access just fine.

Checking all of the settings in Central Administration, everything looks fine there too.  The web application is still defined, all of my settings are correct, etc.  But it seems like IIS is not passing control to SharePoint when the request is made or SharePoint is not accepting and handling the request.

I've even gone through and checked the web.config files for both WSS sites and restored a previous back-up with no success.

Ultimately, I went so far as to reinstall ASP.NET 2.0 (both 32- and 64-bit versions), then ran the WSS setup again to repair the installation, and finally the Configuration Wizard in the hope that it would reconnect whatever is wrong.  It made no difference.

At this point I am still able to access all of the other WSS sites I have hosted on the server, but still cannot get into the most important one!  This is a community site with dozens of users who have been posting files and other information for months.  I cannot lose all of this.  Does anyone have any idea what I need to do or should look at?  I am desperate!


I currently have a WSE 2.0 WebServicesClientProtocol object (generated from WSDL) which has a valid URI as its soapAction (from the WSDL binding). I need to set the <wsa:Action> value in the <soap:Header> to something other than the binding's soapAction. I know this violates the W3C protocol, but the web services belong to another party and cannot be changed.

My code (where "transport" is the WebServicesClientProtocol object):

transport.RequestSoapContext.Addressing.Action = new Microsoft.Web.Services2.Addressing.Action(reqAction);

and I can see in the VS debugger that it sets the action, but when I call the method to send the web request, the <wsa:Action> is reset to the same value as the soapAction in the HTTP header. I assume WSE 2.0 is doing this. Can a custom SoapOutputFilter help me to "manually override" the <wsa:Action>?

If I abandon WSE 2.0 (which I know I should!) can I still override the <wsa:Action> in this manner using WCF?
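For what it's worth, in WCF this kind of override is possible: the outgoing message headers are writable inside an OperationContextScope, so the Action can be replaced per call. A sketch (the proxy and action values are hypothetical):

    using (new OperationContextScope(client.InnerChannel))
    {
        // Replaces the <wsa:Action> for the next call on this channel.
        OperationContext.Current.OutgoingMessageHeaders.Action = "urn:other-party-action";
        client.SomeOperation();
    }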


I have two ASP.NET websites hosted in two sub-domains of the same domain (e.g. subdomain1.domain.com and subdomain2.domain.com). Both websites have Forms Authentication enabled and using the same database for the membership / role / profile providers. I used the information from MSDN (http://msdn.microsoft.com/en-us/library/eb0zx8fc.aspx) to make sure that if a user authenticates in one website, the user is able to access the other website without being asked to login again.

One of the websites contains ASP.NET pages and a Silverlight application and the other website contains WCF services. I am using .NET 4.0 and Silverlight 4.

From the Silverlight app I'm making calls to the WCF services. I used the information from MSDN (http://msdn.microsoft.com/en-us/library/dd560702(VS.95).aspx) to make sure the web-services can be accessed from Silverlight. Also, I have both the clientaccesspolicy.xml and crossdomain.xml files present in the website where the web-services are.

When I browse the application using Internet Explorer 8 everything works fine. But when I browse the application using either Firefox 3.6.8 or Chrome, whenever the Silverlight app is making a call to a web-service I get the following error:

System.ServiceModel.CommunicationException was unhandled by user code
  Message=The remote server returned an error: NotFound.
       at System.ServiceModel.AsyncResult.End[TAsyncResult](IAsyncResult result)
       at System.ServiceModel.Channels.ServiceChannel.EndCall(String action, Object[] outs, IAsyncResult result)
       at System.ServiceModel.ClientBase`1.ChannelBase`1.EndInvoke(String methodName, Object[] args, IAsyncResult result)
       at MyApp.Client.DataProxy.Data.MyAppDataServiceClient.MyAppDataServiceClientChannel.EndGetAllSites(IAsyncResult result)
       at MyApp.Client.DataProxy.Data.MyAppDataServiceClient.MyApp.Client.DataProxy.Data.MyAppDataService.EndGetAllSites(IAsyncResult result)
       at MyApp.Client.DataProxy.Data.MyAppDataServiceClient.OnEndGetAllSites(IAsyncResult result)
       at System.ServiceModel.ClientBase`1.OnAsyncCallCompleted(IAsyncResult result)
  InnerException: System.Net.WebException
       Message=The remote server returned an error: NotFound.
            at System.Net.Browser.AsyncHelper.BeginOnUI(SendOrPostCallback beginMethod, Object state)
            at System.Net.Browser.BrowserHttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
            at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelAsyncRequest.CompleteGetResponse(IAsyncResult result)
       InnerException: System.Net.WebException
            Message=The remote server returned an error: NotFound.
                 at System.Net.Browser.BrowserHttpWebRequest.InternalEndGetResponse(IAsyncResult asyncResult)
                 at System.Net.Browser.BrowserHttpWebRequest.<>c__DisplayClass5.<EndGetResponse>b__4(Object sendState)
                 at System.Net.Browser.AsyncHelper.<>c__DisplayClass2.<BeginOnUI>b__0(Object sendState)

Unfortunately, one of the requirements for the Silverlight app to send the auth cookie is to use the browser HTTP stack. The trouble is that by doing this, I get very generic error messages (like the one above).

If I remove the Forms Authentication settings from the website with the web-services, everything works fine.

Has anyone had problems with this? Do I have to add some specific configurations of logic for Firefox and/or Chrome browsers?

Any help is appreciated. Thank you.


I am using <wl:signin> with Silverlight and have no problem getting the cid back from the AccessToken.

When I run the same program in Azure I do not get the AccessToken.  I have checked the traffic with Fiddler and everything is the same up to the OAuthCallback GET, which carries a cookie when it does not run inside Azure. When it runs inside Azure there is no cookie, and no AccessToken is returned.

What happened to the cookie?

Hello All,

This problem has me stumped.

We are trying out CRM 4.0 and did a fresh install. We have setup CRM on two machines, one is the web server and the other acts as the SQL server. For the purposes of testing we setup AD on the SQL server.

During the installation we tried to use the domain administrator account to run the app pool, but we kept getting a "Verify Domain user account SPN for the Microsoft Dynamics CRM ASP.NET application pool" error. We looked around the forums and found a lot of posts about setting the SPN for the installation account, so we ran the setspn tool for the <Domain Name>\administrator account (the account we were trying to use), but that did not work.

So we went and set the app pool to use the Network account and the installation completed successfully. We used a typical setup during the installation, so I guess all the other settings are standard.

We opened the CRM web page on the server and everything looks good. But a problem arises when we try to access the URL from a client machine (the client was joined to the CRM domain and added to CRM as a system admin). We get the following error:
HTTP Error 401.1 - Unauthorized: Access is denied

The event logs do not seem to have any error reports regarding CRM.

But the weird thing is that if we use the server's IP address, we can access the server. We just can't figure it out: how are we able to access the server using the IP address without any problems, but get the above error when we try to use the computer name / FQDN?

Any Help would be greatly appreciated



Hi all,

I have a similar problem.

I can access any CRM page using IE, no problem. But if I try to access the help pages, I receive the error message below (only on the help pages).

Do you have any idea?

HTTP Error 401.2 - Unauthorized: Access is denied


I have a WCF client that is trying to consume a Java webMethods service.

What they need me to send in the HTTP headers is:
POST /StockQuote HTTP/1.1
Host: www.stockquoteserver.com
Content-Type: text/xml; charset="utf-8"
Content-Length: nnnn
SOAPAction: "Some-URI.."

but what I am sending in the HTTP headers is:
POST /StockQuote HTTP/1.1
Host: www.stockquoteserver.com
Content-Type: text/xml; charset="utf-8"
Content-Length: nnnn

How can I add "SOAPAction"?

(BTW, I tried creating a custom header class derived from MessageHeader and adding it to the header collection in the request, but that ends up in the SOAP headers, not the HTTP headers.)
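With SOAP 1.1 bindings (BasicHttpBinding, or a custom binding with a Soap11 text encoder), WCF emits the SOAPAction HTTP header from the operation's Action value, so a contract along these lines (names taken from the sample headers above, operation name hypothetical) should produce the missing header:

    [ServiceContract]
    public interface IStockQuoteService
    {
        // With a SOAP 1.1 binding this Action value is sent as the
        // HTTP header: SOAPAction: "Some-URI"
        [OperationContract(Action = "Some-URI")]
        string GetLastTradePrice(string symbol);
    }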






