Why does my Windows application require the .NET 3.5 framework?

Tag: windows Author: masterdr Date: 2009-08-30

I have the Target Framework set to 2.0 on my Windows application, yet when I install the app on the server after publishing it through VS 2008, the installer tries to install .NET 3.5 on the server.

I do not want to install 3.5 on my server.

When I copy the files from my local /bin/debug/ to the server and double click on the exe, nothing happens. On my local machine, my app runs.

How can I make this app run on the server without it needing the .NET 3.5 framework?

comments:

For an application deployed to clients, there are valid business cases for not requiring .NET 3.5. For a server, it's less clear. What's your concern about installing .NET 3.5 (or, better yet, 3.5 SP1) on the server? Any existing 2.0 apps will continue to run.

Best Answer

Do any of your dependencies require .NET 3.5? Do you have anything in any config files which might require .NET 3.5?

I suggest you take a copy of what you've got for safekeeping, and then cut it down to the very smallest app which demonstrates the problem. In fact, you might want to start from scratch with a "no-op" app and see whether that has the same behaviour.
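If the config file is a possible culprit, here is a minimal app.config sketch for a 2.0-only Windows app to compare against (illustrative only, not the poster's actual file). The supportedRuntime element pins the app to the CLR that ships with .NET 2.0; anything in the config that references 3.5-only assemblies (System.Core, for example) would be a red flag.

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <startup>
        <!-- v2.0.50727 is the CLR shipped with .NET 2.0 (and shared by 3.0/3.5).
             Listing a different or additional runtime version here can make the
             loader ask for a framework the server doesn't have. -->
        <supportedRuntime version="v2.0.50727"/>
      </startup>
    </configuration>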

comments:

Good point, but no. All dependencies target 2.0.

Other Answer1

Check unused references, perhaps? Are you actually getting an error about the 3.5 framework?

comments:

this bit me ...

Other Answer2

Try building the application in Release mode and deploying it to the server. You will need to grab the application from the /bin/release folder instead of the /bin/debug folder.

Also, check the target framework under the application section of the project properties.
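If there is any doubt about what the project is actually targeting, the setting can also be verified directly in the project file. A VS 2008 C# project targeting 2.0 should contain an MSBuild property along these lines (an excerpt from a hypothetical .csproj, shown only as a sketch):

    <PropertyGroup>
      <!-- v2.0 means the build targets .NET 2.0; if this reads v3.5,
           publishing will list .NET 3.5 as a prerequisite. -->
      <TargetFrameworkVersion>v2.0</TargetFrameworkVersion>
    </PropertyGroup>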

Other Answer3

If you're using Visual Studio to build your setup project, open the setup project's properties and look through the settings. One of them specifies which .NET version the installer package will demand. You have to set it yourself; it doesn't inherit from the target framework of your other projects.
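For a project published with ClickOnce (as in the question), the prerequisite list lives in the project file rather than in a separate setup project. It typically appears as BootstrapperPackage items roughly like the sketch below (an assumption about what VS 2008 generated here, not the poster's actual file). If a 3.5 package is marked Install=true, the generated setup.exe will demand .NET 3.5 regardless of the target framework.

    <ItemGroup>
      <!-- Prerequisites chosen under Project Properties -> Publish -> Prerequisites.
           Whichever framework package has Install set to true is what the
           bootstrapper (setup.exe) will require on the target machine. -->
      <BootstrapperPackage Include="Microsoft.Net.Framework.2.0">
        <Visible>False</Visible>
        <ProductName>.NET Framework 2.0</ProductName>
        <Install>true</Install>
      </BootstrapperPackage>
    </ItemGroup>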