Making It Happen

This is a guide to problem solving for Sage 50 installations, to give you a better idea of the type of problem you are facing. The first step is to narrow things down.
If an individual workstation is performing badly, it could be an issue with the anti-virus software or network speed, or it could just be a very old PC.
If this is a new install, something may have gone wrong during installation; performing a clean reinstall solves most of these problems.
A Windows workstation can work fine as a server; however, there is a user limit. There are also problems on networks with mixed operating systems, due to differences in SMB2 support.
Most servers work fine. Windows servers tend not to perform as well as others, mainly because they do other things as well; Windows SBS servers are the worst performers and are to be avoided. As of V2015 (V21), Sage 50 now requires a Windows server. Windows Terminal Server is the best-performing way to run Sage 50, as it avoids all the issues of network bandwidth; however, it is also an expensive solution.
So you have a problem. Here are some tests you can do to help find the cause.
Relocate the data from the server to the local hard drive. If things then run well, your problem is networking. Remember to check your anti-virus settings after moving the data. This is how I would do it, with my data currently on S:\Sage:

1. I copy the Sage folder from drive S: to drive C:.
2. I exclude C:\Sage from my real-time virus scanning.
3. I change the path in the "company" file from S:\Sage to C:\Sage.

Our guide to finding your data location includes details on finding and editing the company file.
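Step 3 above is just a text substitution in the company file. As a minimal sketch (the function name is mine, and the commented-out company file location is purely hypothetical; check your own install), it could look like this:

```python
# Sketch: repoint the path stored in Sage's "company" file after copying
# the data to the local drive. S:\Sage and C:\Sage are the example paths
# from the text above; the company file location varies by version.
from pathlib import Path

def repoint_company_file(company_file, old_path, new_path):
    """Replace every occurrence of old_path with new_path in the company file."""
    text = Path(company_file).read_text()
    Path(company_file).write_text(text.replace(old_path, new_path))

# Hypothetical usage -- substitute the real location of your company file:
# repoint_company_file(r"C:\SageData\company", r"S:\Sage", r"C:\Sage")
```

Take a backup of the company file first; a wrong path here will stop Sage finding its data.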
There are two ways to tell Sage how to access the data on a server: you can either map a drive or use a UNC path. On my server, called "fileserver", I have a file share called sageshare, and in there is a folder called Sage that contains the data. I could map sageshare to drive S: and tell Sage to look in S:\Sage for the data (this was the recommended method in the past). Sage now recommends using a UNC path, e.g. \\fileserver\sageshare\Sage. However, we have found that using mapped drives rather than UNC can make a huge difference to performance, so it is always worth testing both and seeing if one works better than the other. Running a report such as a transactional TB is a good test. Make sure you change your anti-virus settings when you change your data path.
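Alongside timing a report, you can compare the two access methods directly by timing how long it takes to read every file under each path. A small sketch (the function is mine; the example paths are the ones named above):

```python
# Sketch: read every file under a data folder and time it. Run once
# against the mapped-drive path and once against the UNC path to see
# which performs better on a given workstation.
import time
from pathlib import Path

def time_read_all(root):
    """Read every file under root; return (seconds_taken, bytes_read)."""
    start = time.perf_counter()
    total = 0
    for f in Path(root).rglob("*"):
        if f.is_file():
            total += len(f.read_bytes())
    return time.perf_counter() - start, total

# Example comparison (run on a workstation):
# print(time_read_all(r"S:\Sage"))
# print(time_read_all(r"\\fileserver\sageshare\Sage"))
```

Run each test a couple of times, as the second run may benefit from caching on the server.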
The easiest check if you cannot connect to the server is to ping it from the command line.

First do: PING 192.168.1.100
Substitute the IP address above with the actual IP address of the server (type IPCONFIG at the server's command line to find it). If you get lost packets or highly variable response times, this suggests network problems. If you get no reply at all, you have no network connectivity.

Second do: PING SERVERNAME
This checks that the server name is being resolved properly.

Want a ping to run longer? Use PING SERVERNAME -t to make it run until you press CTRL-C.
Copying a large file over the network is a great test of performance, but it is not everything. The speed at which the name of the server can be resolved to its actual IP address, the speed at which files can open and close, the type of caching and so on all affect performance. However, if a large file takes a long time to copy, and if the copy causes packet loss and higher response times on a ping, that indicates network congestion.
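The large-file copy test can be put on a stopwatch so you get a comparable throughput number each time. A minimal sketch (the function and test.bin file are mine; the share path is the example server from this guide):

```python
# Sketch: time copying one large file and report throughput in MB/s.
# Modern gigabit LANs should manage a large fraction of ~100 MB/s;
# numbers far below that point at cabling, NICs or switch problems.
import os
import shutil
import time

def copy_speed(src, dst):
    """Copy src to dst and return the observed throughput in MB/s."""
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return os.path.getsize(src) / elapsed / 1_000_000

# Hypothetical usage with a large local test file:
# print(copy_speed(r"C:\test.bin", r"\\fileserver\sageshare\test.bin"))
```

Use a file of a few hundred megabytes at least; very small files mostly measure latency and caching rather than throughput.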
DNS is one way that computers turn names (which humans like) into numbers (which computers use). If DNS is not correctly configured, it can have a devastating effect on performance. Domains (not workgroups) rely on DNS: all workstations should have the server as their DNS server, and the server should have the ISP's DNS servers as its own DNS servers. The server should act as the DHCP server, and the broadband router should have its DHCP server disabled.
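A quick way to see whether DNS is the bottleneck is to time the name lookup itself, separately from any file transfer. A small sketch (the function is mine; "fileserver" is the example server name from this guide):

```python
# Sketch: time how long the workstation takes to resolve a server name
# to an IP address. On a healthy LAN with correct DNS this should take
# only a few milliseconds; lookups taking seconds point at DNS trouble.
import socket
import time

def resolve(name):
    """Resolve name to an IPv4 address; return (ip, seconds_taken)."""
    start = time.perf_counter()
    ip = socket.gethostbyname(name)
    return ip, time.perf_counter() - start

# Example: print(resolve("fileserver"))
```

If resolving the server's name is slow but pinging its raw IP address is fast, the DNS configuration described above is the place to look.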
Cat 5e is required for most Gigabit Ethernet installations. Contrary to popular understanding, this is not a type of cable; it is a standard. The cable must comply with it, but so must the network ports and, critically, the installation of the cable: you can't bend the cable sharply, you have to fix the cable to the structure of the building, and so on. Gigabit networking is fairly fault tolerant, but in a high-pressure network environment with a lot of data, cabling faults can delay data throughput and cause control-protocol issues; in Sage we then see files locked, files left open, and slow performance. There are two types of test equipment used to check cabling: a "continuity" tester, which just checks that the cables are electrically connected, and an analyser, which tests the signal-to-noise ratio and a whole lot more. The former costs under £10 and the latter around £1,000; what they do is quite different.
The infrastructure is only as good as its weakest point: if you have a 10Mbit card in your server, nothing will go faster than 10Mbit.
Hardware is not what it seems. That box with the flashing lights and a fistful of cables is not an inert box; it is a computer system in its own right. It will crash if it gets hit by a power blip, and it will crash if a stray alpha particle from the sun happens to hit the right thing in the right place. Rebooting these boxes, by unplugging the power, waiting 15 seconds and plugging them back in again, can solve a myriad of strange problems.
Wireless: Sage is network hungry. Forget running it over wireless; it will not work well.

Mains Ethernet (powerline adapters): Sage is network hungry. Forget running it over mains Ethernet; it will not work well either.

VPN: You are having a laugh. Sage will not work over a VPN, even very slowly.

Terminal Server: This is the best way to run Sage, with no network congestion, but make sure the server is properly specified for the job.