254 shaares · 56 results tagged "dnn"
I am totally new to DNN development. I made a module using the Christoc module template, and all of my logic is in the View.ascx control. I want to add a new view that is basically a button; when the user clicks it, they are redirected to the View.ascx view. I have looked everywhere for a well-explained, step-by-step tutorial or example of how to do this, but I haven't found one.
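A minimal sketch of one way this is often handled, assuming the extra control is registered in the module definition under its own control key (the "Landing" key, class, and namespace below are hypothetical): the button's click handler simply redirects back to the current page, which renders the module's default View.ascx control.

```csharp
// Landing.ascx.cs - hypothetical second control registered under the key "Landing".
using System;
using DotNetNuke.Common;
using DotNetNuke.Entities.Modules;

namespace MyCompany.Modules.MyModule
{
    public partial class Landing : PortalModuleBase
    {
        // Wired to an <asp:Button OnClick="GoToView_Click" ... /> in Landing.ascx.
        protected void GoToView_Click(object sender, EventArgs e)
        {
            // NavigateURL(TabId) builds the friendly URL of the current page;
            // loading the page again shows the module's default View.ascx control.
            Response.Redirect(Globals.NavigateURL(TabId));
        }
    }
}
```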
I want to set up a simple listing of cards (with a heading, a picture, some text, and a link), and when the user clicks the link on a card, it takes them to a full "details" view that has a friendly URL. Very similar to a news module, but considerably simpler.
So I would have https://mywebsite.com/careers and then a listing. When a user clicks on a career, they would be taken to https://mywebsite.com/careers/dairy-farmer
The URL would be the title of the career. On the details page, there would be a lot more content as well, that the content editor could manage.
Is this video still relevant? https://youtu.be/RtmOYvaeJpo
Or is there a newer, simpler way of doing things? Also, is this something I can set up in the Content module as opposed to the App module?
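For the listing side, here is a rough sketch of how a custom module control might build each card's details link, assuming DNN's advanced (human-friendly) URL format, which turns extra name/value parameters into path segments. The "career" parameter name, slug helper, and class names are assumptions; getting exactly /careers/dairy-farmer (with no parameter name in the path) would additionally require a custom ExtensionUrlProvider.

```csharp
// Listing.ascx.cs (hypothetical) - building the link for each career card.
using DotNetNuke.Common;
using DotNetNuke.Entities.Modules;

namespace MyCompany.Modules.Careers
{
    public partial class Listing : PortalModuleBase
    {
        // Returns a URL such as /careers/career/dairy-farmer when the site uses
        // DNN's human-friendly URL format; a custom ExtensionUrlProvider would be
        // needed to drop the "career" segment and produce /careers/dairy-farmer.
        protected string GetDetailsUrl(string careerTitle)
        {
            var slug = careerTitle.Trim().ToLowerInvariant().Replace(" ", "-");
            return Globals.NavigateURL(TabId, string.Empty, "career=" + slug);
        }
    }
}
```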
Notes:
Check the DNN Scheduler to see if any active jobs are taking longer than they should. For example, if the Site Crawler job is constantly running, check the files in the Portals folder to make sure everything located there actually belongs there. The crawler rebuilds the search index, and if you have a lot of files that can take hours to complete. If the files should be there, disable the crawler job and run it during your slowest time of day (1:00 AM?). I ran into this problem on a server that had hundreds of thousands of documents in the Portals folder and ended up solving it by running the crawler between 1:00 AM and 5:00 AM for a few days until it had indexed all of the files. Once the files are indexed, it only has to index new and changed files, so it should only be a burden the first time it runs.
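The Host > Schedule page shows this information in the UI, but if you prefer to check from code, something like the following sketch lists each job and how often it is configured to run. It uses the DNN scheduling API as I recall it from the DNN 7/8 era (SchedulingProvider, ScheduleItem); verify the member names against your version before relying on it.

```csharp
// Rough diagnostic sketch: list each scheduled job and its configured frequency.
using System.Text;
using DotNetNuke.Services.Scheduling;

public static class ScheduleReport
{
    public static string Build()
    {
        var sb = new StringBuilder();
        foreach (ScheduleItem item in SchedulingProvider.Instance().GetSchedule())
        {
            sb.AppendFormat("{0} - enabled: {1}, runs every {2} {3}\n",
                item.TypeFullName, item.Enabled, item.TimeLapse, item.TimeLapseMeasurement);
        }
        return sb.ToString();
    }
}
```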
Another possible cause is exceptions. If your site is throwing a large number of exceptions, it will slow down. Handling the exceptions and then logging them (to the DNN EventLog table in the database and to the Log4Net files) can be brutal if the site is constantly throwing them. If the site is also running in DEBUG mode, the performance hit is multiplied by at least 30 times, because .NET collects additional information about each exception while running in debug mode. That is brutal for your site's performance.
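As an illustration of the kind of change that removes much of this overhead (a hedged sketch, not from the original thread): prefer non-throwing checks for expected conditions, and reserve DNN's exception helper for genuinely unexpected failures so they still land in the EventLog and Log4Net files.

```csharp
using System;
using DotNetNuke.Services.Exceptions;

public static class SettingsReader
{
    // Avoid exceptions for expected conditions: TryParse never throws,
    // so a malformed setting value costs almost nothing.
    public static int ReadIntSetting(string rawValue, int defaultValue)
    {
        return int.TryParse(rawValue, out var parsed) ? parsed : defaultValue;
    }

    // Genuinely unexpected failures still get logged through the core helper,
    // which writes to the DNN EventLog table and the Log4Net files.
    public static void DoRiskyWork(Action work)
    {
        try
        {
            work();
        }
        catch (Exception ex)
        {
            Exceptions.LogException(ex);
        }
    }
}
```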
Check the server logs to see how often IIS is recycling the application pool for your DNN site. If it is happening often and you are using the default application pool settings, that is another sign of a large number of exceptions being thrown: by default, IIS will take the application pool down if the worker process fails too many times within a short period (rapid-fail protection). If overlapped recycling is also enabled, so that a new instance of the site is brought up and runs side by side before IIS terminates the existing one, a site that is constantly throwing exceptions can hit a bottleneck that cripples performance. In that situation I usually stop IIS from recycling the application pool when too many failures occur in a short period. That may not be the best option for you, but if you stay on top of the exceptions your site throws, you can disable rapid-fail protection and let IIS run instances side by side after an app pool recycle. This is nice to have when you recycle during busy periods: existing traffic completes on the old instance while new traffic is sent to the new one, and once all traffic is hitting the new instance, IIS terminates the older one.
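If you manage the server from code or scripts, the same two settings can be flipped with the Microsoft.Web.Administration API. This is only a sketch: the pool name "DnnSitePool" is an assumption, and it must run with administrative rights on the web server.

```csharp
// Sketch: adjust rapid-fail protection and overlapped recycling for a DNN
// application pool (pool name "DnnSitePool" is an assumption).
using Microsoft.Web.Administration;

public static class AppPoolTuning
{
    public static void Configure()
    {
        using (var serverManager = new ServerManager())
        {
            ApplicationPool pool = serverManager.ApplicationPools["DnnSitePool"];

            // Stop IIS from taking the pool down after repeated worker-process
            // failures in a short window (only do this if you watch the logs).
            pool.Failure.RapidFailProtection = false;

            // Keep overlapped recycling: the new worker process starts and
            // serves new requests before the old one is terminated.
            pool.Recycling.DisallowOverlappingRotation = false;

            serverManager.CommitChanges();
        }
    }
}
```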
If none of the above helps, run SQL Profiler against your database to see whether there is any extreme database activity going on. Also check for any database locks.
There are a lot of possible causes that can slow down DNN. The best way to find out what is going on is to run a profiler on the server (Redgate ANTS Performance Profiler or Telerik/Progress JustTrace).
What is the difference between the DNN module control types Edit and View?
Turn-key white papers are how we share some of our best advice with the public. We take pride in ensuring that best practices are followed by all, even those who are not our current customers.
After upgrading DNN it is possible that your prior performance configurations are lost. It is important to regularly verify your configuration.
ASP.NET AJAX allows developers to quickly introduce Ajax features without extra development; however, using it within DNN comes with some challenges.
Many extension developers rely on the internal DotNetNuke Text Editor to provide rich-text editing experiences to users; however, Visual Studio may not always make this process straightforward.
Oftentimes you might need to confirm actions with users. By creating a reusable confirmation in your site's skin, you can easily control its look and feel.
DotNetNuke provides its own internal data access strategy for accessing SQL Server, and it works well for most standard situations. However, more complex items, such as calling stored procedures with Table-Valued Parameters, can be less straightforward to implement.
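By way of illustration (a sketch, not the linked article's code): one common workaround is to open the connection yourself with plain ADO.NET and pass a DataTable as a structured parameter. The stored procedure, the "dbo.IdList" table type, and the column layout are assumptions; "SiteSqlServer" is DNN's standard connection string name.

```csharp
// Sketch: calling a stored procedure with a table-valued parameter using plain ADO.NET.
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

public static class TvpExample
{
    public static void SendIds(int[] ids)
    {
        // Build rows matching the user-defined table type dbo.IdList (Id INT).
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        foreach (var id in ids)
        {
            table.Rows.Add(id);
        }

        var connectionString =
            ConfigurationManager.ConnectionStrings["SiteSqlServer"].ConnectionString;

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.MyModule_ProcessIds", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            var parameter = command.Parameters.AddWithValue("@Ids", table);
            parameter.SqlDbType = SqlDbType.Structured;   // marks it as a TVP
            parameter.TypeName = "dbo.IdList";            // the SQL table type

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```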
When working with custom solutions, it is commonplace to need multiple extensions; with small modifications, it is possible to package those extensions all at once.
Over the past six months I have been pushing to establish a set of NuGet packages for DotNetNuke extension development; these have finally been released!
Developers often need to add custom functionality that ties into, or consumes data from, existing DNN tables. There are many ways to do this, and this post explores one of the best approaches, one that avoids impacting DNN itself.
DNN provides methods for encrypting and decrypting parameter information; however, there are a few gotchas that can impact the usability of this feature.
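A sketch of the usual round trip with DNN's DotNetNuke.Common.Utilities.UrlUtils helpers (verify the overloads available in your DNN version; older releases may require passing an encryption key explicitly). The "item" parameter name and method names in the wrapper class are assumptions; one gotcha worth noting is that the encrypted value is base64-style text and should be URL-encoded before being placed in a query string.

```csharp
// Sketch: encrypting a value for a link and decrypting it on the receiving end.
using System.Web;
using DotNetNuke.Common.Utilities;

public static class SecureParameterExample
{
    public static string BuildLink(string baseUrl, int itemId)
    {
        string encrypted = UrlUtils.EncryptParameter(itemId.ToString());
        // URL-encode because the encrypted text can contain characters
        // such as '+' and '/' that are not query-string safe.
        return baseUrl + "?item=" + HttpUtility.UrlEncode(encrypted);
    }

    public static int ReadLink(string rawParameterValue)
    {
        // Request.QueryString values arrive already URL-decoded.
        string decrypted = UrlUtils.DecryptParameter(rawParameterValue);
        return int.Parse(decrypted);
    }
}
```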
I love sharing stories of great experiences with third-party vendors, as it doesn't happen as often as I would like. When a performance issue was recently identified in DNN's use of the 51Degrees software, they jumped in and helped greatly.
Module developers have many choices as they create their extensions, one of which is the level of emphasis placed on performance. A few simple tweaks can set your module apart from the rest.
Installing multiple DotNetNuke extensions one at a time can be tedious and can result in unexpected downtime. Thankfully, there is an easy way to install several extensions at once.
Now that we have a good structure in place, how do we deploy to different environments and test our DotNetNuke solutions?
In the final installment of my Enterprise Extension Development blog series, we look at the actual project file structure and focus on reusability.