Board Problems

On or off topic discussions. Whatever is on your mind...
Lord Hypno
Hypnotist
Posts: 647
Joined: Tue May 10, 2005 1:05 am
Location: Plant City Florida

Board Problems

Post by Lord Hypno » Fri Nov 18, 2011 6:26 pm

So we are feeling growing pains. Servage's SQL server only lets us do 150,000 requests a day, and when we go over, it shuts down. The people over at phpBB say it's a common problem: you just need to add a user or two to the database, because the request limit is per user. However, Servage does not allow more than one user, so that fix won't work. BUT my workaround has been to back up the database every 20 hours or so, and that also resets the limit.

SO what I need is someone who knows how to make a cron job back up the SQL database, because it's a bit over my head.
___________________

Fear is The Mind Killer
tinyspider
Guitar Hero
Posts: 295
Joined: Mon Aug 22, 2005 9:02 am
Location: South America

Re: Board Problems

Post by tinyspider » Sat Nov 19, 2011 2:16 am

I'm not really an expert at cron jobs, but googling around a little bit turned up this:

```perl
#!/usr/bin/perl
use warnings;
use strict;

# Set connection values
my $DATABASE  = 'database_name';
my $DUMP_DIR  = '/path/to/backup_folder/';
my $MYSQLDUMP = '/usr/bin/mysqldump';   # note: mysqldump, not mysqladmin
my ($DUMP_FILE, $DUMP_SCRIPT);
######################################

# Remove any previous dump, then write a fresh one.
$DUMP_FILE = $DUMP_DIR . $DATABASE . ".sql";
if (-e $DUMP_FILE) { unlink($DUMP_FILE); }

# Credentials come from the user's .my.cnf, so no password on the command line.
$DUMP_SCRIPT = $MYSQLDUMP . " --defaults-file=/home/<username>/.my.cnf" . ' --opt' . " $DATABASE > $DUMP_FILE";
system($DUMP_SCRIPT);

exit(0);
```
In the first part of the script you need to specify the database name and the path where you want to save the backup (full Linux path). I'm presuming Servage uses cPanel or a similar hosting control panel, hence the Perl path. "<username>" should also be replaced with the user for this host; I'm assuming it should be either "basictek" or "basictek.net" (without the quotes). Let me know if this helps; maybe Servage has some sample cron jobs on the help section.
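For the scheduling side, a crontab line like the one below could work; this is just a sketch, and the script path and Perl location are assumptions you'd have to adapt to Servage's cron panel. Since cron can't express "every 20 hours" directly, twice a day is the usual approximation:

```shell
# Hypothetical crontab entry: run the backup script at 02:00 and 22:00.
# The path /home/<username>/db_backup.pl is a placeholder; adjust it to
# wherever you upload the Perl script on your host.
0 2,22 * * * /usr/bin/perl /home/<username>/db_backup.pl >/dev/null 2>&1
```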
Lord Hypno
Hypnotist
Posts: 647
Joined: Tue May 10, 2005 1:05 am
Location: Plant City Florida

Re: Board Problems

Post by Lord Hypno » Sat Nov 19, 2011 5:40 am

araenae wrote: I'm presuming Servage uses cPanel or a similar hosting control panel [...] maybe Servage has some sample cron jobs on the help section.
No, they have their own custom BS, and I've tried a few like that one, but I'm not sure how to make the PHP file. I even downloaded a few and tried to patch them to what I need, but it fails. Here is what it looks like (this is all I have to work with, and Servage's help is... well, less than helpful):
Clipboard01.jpg
Clipboard02.jpg
___________________

Fear is The Mind Killer
tinyspider
Guitar Hero
Posts: 295
Joined: Mon Aug 22, 2005 9:02 am
Location: South America

Re: Board Problems

Post by tinyspider » Sun Nov 20, 2011 1:08 am

Gee, that panel sucks lol. The sample I posted is a cron job done in Perl, but the one you are trying to add is done in PHP. If you already uploaded a PHP script that does the backup, make sure you're typing the full path to the script. I browsed the Servage wiki, but there's no article about the full path to your root folder, so you might want to upload a PHP file to the www folder (name it info.php or something like that) with this code:

```php
<?php phpinfo(); ?>
```
Then open it in your browser to see the PHP information; that page should show you the full path to your root folder, which you can use to infer the path to your script.
If you catch me on Yahoo I might be able to help you better; I'm usually on late at night.
DarkNS
50+ FIMS Regular
Posts: 78
Joined: Wed Feb 22, 2006 6:34 pm

Re: Board Problems

Post by DarkNS » Sun Nov 20, 2011 10:12 am

The problem is that there is no "backup" standard in MySQL; you can do a dump of the DB, but that's it.

My point is that the "backup" action from the server perspective could only be doing it from the administration panel.

You can test this if you run a dump of the DB like this:

```shell
mysqldump -u USER -pPASSWORD -h HOST DBNAME > mybackup.sql
```
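Building on that command, a small wrapper script can timestamp and compress each dump so old backups are not overwritten. This is only a sketch: USER, PASSWORD, HOST, and DBNAME are the same placeholders as above, and the dump step is skipped where mysqldump isn't installed.

```shell
#!/bin/sh
# Sketch: timestamped, gzip-compressed MySQL dump. Credentials are placeholders.

backup_name() {
    # Build a file name like DBNAME-2011-11-20.sql.gz from a db name and a date.
    printf '%s-%s.sql.gz' "$1" "$2"
}

FILE=$(backup_name "DBNAME" "$(date +%Y-%m-%d)")

# Only attempt the dump where mysqldump is actually available.
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump -u USER -pPASSWORD -h HOST DBNAME | gzip > "$FILE"
fi
```

Gzipping the dump also helps if the host bills for disk space, since SQL dumps compress very well.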
The real (but more expensive) solution is to implement various mechanisms to reduce the amount of queries. Some of them are:
- Page cache
- Query optimization (as in, make one more complex query instead of a few simple ones)

But all this depends more on phpBB than on you.

If I can help in some way, let me know.
Wishmaster
100+ FIMS Warrior
Posts: 106
Joined: Wed May 18, 2005 6:14 pm

Re: Board Problems

Post by Wishmaster » Tue Nov 29, 2011 8:08 pm

I would guess a large number of visitors are probably robots/crawlers of some kind.

I tried to access robots.txt, the file that crawlers are supposed to read and obey when they visit a site, but I could not find one here. I tried:

http://lm.basictek.net/robots.txt
robots.txt
http://basictek.net/robots.txt

What you want is to create a robots.txt file and place it in the root of the domain (so probably lm.basictek.net or basictek.net).

Here is an example robots.txt file for a site I ran years ago. We used a phpBB2 forum, so a similar config may work here. You basically define what the robot can crawl (or what it can't crawl).

We had some private forums, so we explicitly denied those, and most importantly we denied access to some of the "database heavy" pages, like the member list. The '/' in the list below is relative to the root, so if you place the robots.txt at lm.basictek.net/robots.txt, then you probably don't need the "forum" path, because the forum is located at the root of the domain lm.basictek.net.

Now whether this will work at lm.basictek.net, or whether it needs to be at basictek.net, I do not know. But I highly recommend you give this a go, and I would also make the other forums visible again; it is annoying to have to log in to see things =\, and if the crawlers have cached the link to a specific forum, they will still crawl it, because the direct links to the forums still work (even though they are invisible).

Here is the 'robots.txt' file:

```text
User-agent: *

Disallow: /forum/admin/
Disallow: /forum/images/
Disallow: /forum/includes/
Disallow: /forum/language/
Disallow: /forum/templates/
Disallow: /forum/common.php
Disallow: /forum/config.php
Disallow: /forum/groupcp.php
Disallow: /forum/memberlist.php
Disallow: /forum/modcp.php
Disallow: /forum/posting.php
Disallow: /forum/profile.php
Disallow: /forum/privmsg.php
Disallow: /forum/viewonline.php
Disallow: /forum/search.php
Disallow: /forum/faq.php
Disallow: /forum/viewforum.php?f=2
Disallow: /forum/viewforum.php?f=5
```
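One way to sanity-check the file before uploading (my own sketch, using only grep and a throwaway sample file) is to confirm that the paths you care about are actually listed as disallowed:

```shell
#!/bin/sh
# Sketch: check whether a robots.txt file lists a given path as disallowed.
disallows() {
    # $1 = robots.txt file, $2 = path; succeeds if an exact Disallow line exists.
    # -F treats the path as a fixed string (so '?' in query strings is literal),
    # -x requires the whole line to match, -q suppresses output.
    grep -qxF "Disallow: $2" "$1"
}

# Try it against a small sample file.
cat > /tmp/robots_sample.txt <<'EOF'
User-agent: *
Disallow: /forum/memberlist.php
EOF

disallows /tmp/robots_sample.txt /forum/memberlist.php && echo "blocked"
# prints "blocked"
```

Note this only checks for exact lines; real crawlers treat each Disallow entry as a path prefix, so it's a quick local sanity check, not a full parser.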
