Explore comprehensive backup strategies in Python, from simple file copies to advanced database and cloud solutions, with practical code examples for developers worldwide.
Python Backup Strategies: A Comprehensive Guide to Implementing Data Protection
In our data-driven world, the information that powers applications, generates insights, and preserves institutional knowledge is among an organization's most valuable assets. Yet data is fragile: hardware failures, software bugs, cyber threats, and human error are unavoidable. A single unexpected incident can wipe out years of work, erode user trust, and inflict irreparable damage on a business. In this environment, a solid backup strategy is not an IT afterthought but a fundamental pillar of business continuity and resilience.
For developers and system administrators, Python provides a powerful, flexible, and accessible toolkit for building custom automated backup solutions tailored to any environment. Thanks to its standard library and rich ecosystem of third-party packages, it can handle everything from simple file copies to complex encryption, version management, and cloud storage backups. This guide walks developers, DevOps engineers, and IT professionals around the world through the strategies, tools, and best practices for implementing effective data protection with Python.
The 3-2-1 Rule: The Cornerstone of Backup Strategy
Before diving into code, it is essential to understand the guiding principle of any serious backup plan: the 3-2-1 rule. It is a globally recognized, proven best practice that provides a simple framework for ensuring data resilience.
- Three copies of your data: This includes the primary production data and at least two backups. The more copies you have, the lower the risk of losing the data entirely.
- Two different storage media: Do not keep all copies on the same type of device. For example, you might store the primary data on a server's internal SSD, one backup on an external hard drive (or network-attached storage, NAS), and another on a different medium such as cloud storage. This protects against failures tied to a particular type of storage.
- One off-site copy: This is the most critical piece for disaster recovery. If a fire, flood, or theft affects your primary location, an off-site backup keeps your data safe. The off-site location could be a physical office in another city, but today it is more commonly a secure cloud storage provider.
Keep the 3-2-1 rule in mind as we explore the various Python techniques below. Our goal is to build scripts that help implement this strategy effectively and automatically.
Basic Local Backup Strategies with Python
The first step in any backup strategy is securing a local copy. Python's standard library provides powerful tools for file and directory operations that make this straightforward.
Simple File and Directory Copies with `shutil`
The `shutil` (shell utility) module is ideal for high-level file operations. It abstracts away the complexity of manual file reads and writes, letting you copy files or entire directory trees with a single command.
Use cases: backing up application configuration directories, user upload folders, or the source code of a small project.
Copying a single file: `shutil.copy(source, destination)` copies a file along with its permissions.
Copying an entire directory tree: `shutil.copytree(source, destination)` recursively copies a directory and everything inside it.
Practical Example: Backing Up a Project Folder
```python
import shutil
import os
import datetime

source_dir = '/path/to/your/project'
dest_dir_base = '/mnt/backup_drive/projects/'

# Create a timestamp for a unique backup folder name
timestamp = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M-%S')
dest_dir = os.path.join(dest_dir_base, f'project_backup_{timestamp}')

try:
    shutil.copytree(source_dir, dest_dir)
    print(f"Successfully backed up '{source_dir}' to '{dest_dir}'")
except FileExistsError:
    print(f"Error: Destination directory '{dest_dir}' already exists.")
except Exception as e:
    print(f"An error occurred: {e}")
```
Creating Compressed Archives
Copying directories works well, but it can leave you with a huge number of files. Compressing a backup into a single archive (such as a `.zip` or `.tar.gz` file) has several advantages: it saves significant storage space, shortens network transfer times, and bundles everything into a single, manageable file.
The `shutil.make_archive()` function makes this very easy.
Practical Example: Creating a Compressed Backup Archive
```python
import shutil
import datetime
import os

source_dir = '/var/www/my_application'
archive_dest_base = '/var/backups/application/'

# Ensure the destination directory exists
os.makedirs(archive_dest_base, exist_ok=True)

# Create a timestamped filename
timestamp = datetime.datetime.now().strftime('%Y-%m-%d')
archive_name = os.path.join(archive_dest_base, f'my_app_backup_{timestamp}')

try:
    # Create a gzipped tar archive (.tar.gz)
    archive_path = shutil.make_archive(archive_name, 'gztar', source_dir)
    print(f"Successfully created archive: {archive_path}")
except Exception as e:
    print(f"An error occurred during archival: {e}")
```
Intermediate Strategies: Synchronization and Remote Backups
Local backups are a good starting point, but satisfying the 3-2-1 rule requires keeping a copy off-site. That means transferring data over the network, where efficiency and security become paramount.
The Power of Incremental Backups with `rsync`
For large directories or frequent backups, re-copying all of the data every time is inefficient. This is where `rsync` shines. It is a classic command-line utility famous for its delta-transfer algorithm, meaning it copies only the parts of files that have actually changed. This dramatically reduces transfer times and network bandwidth usage.
From within Python, you can harness this power by running `rsync` as a command-line process via the `subprocess` module.
Practical Example: Calling `rsync` from Python for Remote Backups
```python
import subprocess

source_dir = '/path/to/local/data/'
remote_user = 'backupuser'
remote_host = 'backup.server.com'
remote_dir = '/home/backupuser/backups/data/'

# The rsync command. -a is for archive mode, -v for verbose, -z for compression.
# The trailing slash on source_dir is important for rsync's behavior.
command = [
    'rsync',
    '-avz',
    '--delete',  # Deletes files on the destination if they're removed from the source
    source_dir,
    f'{remote_user}@{remote_host}:{remote_dir}'
]

try:
    print(f"Starting rsync backup to {remote_host}...")
    # Using check=True will raise CalledProcessError if rsync returns a non-zero exit code
    result = subprocess.run(command, check=True, capture_output=True, text=True)
    print("Rsync backup completed successfully.")
    print("STDOUT:", result.stdout)
except subprocess.CalledProcessError as e:
    print("Rsync backup failed.")
    print("Return Code:", e.returncode)
    print("STDERR:", e.stderr)
except Exception as e:
    print(f"An unexpected error occurred: {e}")
```
Pure-Python SFTP Transfers with `paramiko`
If you prefer a pure-Python solution that does not depend on external command-line tools, the `paramiko` library is an excellent choice. It provides a complete implementation of the SSHv2 protocol, including SFTP (SSH File Transfer Protocol), enabling secure, programmatic file transfers.
First, install it: `pip install paramiko`
Practical Example: Uploading a Backup Archive over SFTP with `paramiko`
```python
import paramiko
import os

host = 'backup.server.com'
port = 22
username = 'backupuser'
# For production, always use SSH key authentication instead of passwords!
# password = 'your_password'
private_key_path = '/home/user/.ssh/id_rsa'

local_archive_path = '/var/backups/application/my_app_backup_2023-10-27.tar.gz'
remote_path = f'/home/backupuser/archives/{os.path.basename(local_archive_path)}'

try:
    # Load private key
    key = paramiko.RSAKey.from_private_key_file(private_key_path)

    # Establish SSH client connection
    with paramiko.SSHClient() as ssh_client:
        ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        # ssh_client.connect(hostname=host, port=port, username=username, password=password)
        ssh_client.connect(hostname=host, port=port, username=username, pkey=key)

        # Open SFTP session
        with ssh_client.open_sftp() as sftp_client:
            print(f"Uploading {local_archive_path} to {remote_path}...")
            sftp_client.put(local_archive_path, remote_path)
            print("Upload complete.")
except Exception as e:
    print(f"An error occurred during SFTP transfer: {e}")
```
Advanced Strategies: Cloud Storage Integration
Cloud storage is an ideal destination for off-site backups. Providers such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure offer highly durable, scalable, and cost-effective object storage services that are well suited to storing backup archives.
Backing Up to Amazon S3 with `boto3`
Amazon S3 (Simple Storage Service) is one of the most popular object storage services. The `boto3` library is the official AWS SDK for Python and makes working with S3 easy.
First, install it: `pip install boto3`
Security first: never hard-code AWS credentials in your scripts. Configure them via environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_SESSION_TOKEN`) or the AWS credentials file (`~/.aws/credentials`); `boto3` will find and use them automatically.
Practical Example: Uploading a Backup File to an S3 Bucket
```python
import boto3
from botocore.exceptions import ClientError
import os

# Configuration
BUCKET_NAME = 'your-company-backup-bucket-name'  # Must be globally unique
LOCAL_FILE_PATH = '/var/backups/application/my_app_backup_2023-10-27.tar.gz'
S3_OBJECT_KEY = f'application_backups/{os.path.basename(LOCAL_FILE_PATH)}'

def upload_to_s3(file_path, bucket, object_name):
    """Upload a file to an S3 bucket"""
    # Create an S3 client. Boto3 will use credentials from the environment.
    s3_client = boto3.client('s3')
    try:
        print(f"Uploading {file_path} to S3 bucket {bucket} as {object_name}...")
        s3_client.upload_file(file_path, bucket, object_name)
        print("Upload successful.")
        return True
    except ClientError as e:
        print(f"An error occurred: {e}")
        return False
    except FileNotFoundError:
        print(f"The file was not found: {file_path}")
        return False

# Execute the upload
if __name__ == "__main__":
    upload_to_s3(LOCAL_FILE_PATH, BUCKET_NAME, S3_OBJECT_KEY)
```
You can strengthen this further with S3's built-in features: use versioning to retain a history of backups, and use lifecycle policies to automatically move old backups to cheaper storage tiers (such as S3 Glacier) or delete them after a set period.
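As a concrete illustration of such a lifecycle policy, the sketch below builds a rule with `boto3`'s `put_bucket_lifecycle_configuration` API. The rule ID, prefix, and retention periods are placeholder assumptions, not recommendations; treat this as a minimal sketch rather than a production policy:

```python
def build_lifecycle_config(prefix: str,
                           glacier_after_days: int = 30,
                           expire_after_days: int = 365) -> dict:
    """Build an S3 lifecycle configuration that moves backups under
    `prefix` to Glacier after one period and deletes them after another."""
    return {
        'Rules': [{
            'ID': 'backup-retention',          # illustrative rule name
            'Status': 'Enabled',
            'Filter': {'Prefix': prefix},
            'Transitions': [
                {'Days': glacier_after_days, 'StorageClass': 'GLACIER'}
            ],
            'Expiration': {'Days': expire_after_days},
        }]
    }

def apply_lifecycle(bucket: str, config: dict) -> None:
    """Apply the lifecycle configuration to a bucket (requires AWS credentials)."""
    import boto3  # imported here so the config can be built without boto3 installed
    s3 = boto3.client('s3')
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket, LifecycleConfiguration=config
    )

# Example: archive backups to Glacier after 30 days, delete after a year.
config = build_lifecycle_config('application_backups/')
# apply_lifecycle('your-company-backup-bucket-name', config)  # uncomment to apply
```

Defining the policy once on the bucket means every future upload inherits the retention behavior, so the backup script itself stays simple.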
Integrating with Other Cloud Providers
The pattern for other cloud providers is very similar; each has its own Python SDK.
- Google Cloud Storage: use the `google-cloud-storage` library.
- Microsoft Azure Blob Storage: use the `azure-storage-blob` library.
In each case, the process involves secure authentication, creating a client object, and calling an `upload` method. This modular approach lets you build cloud-agnostic backup scripts if you need them.
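One way to sketch that cloud-agnostic structure is to hide each provider's client behind a tiny common interface. The interface and class names here are illustrative (not from any SDK), and the local-filesystem implementation doubles as a way to test the backup script without cloud credentials:

```python
import shutil
from pathlib import Path
from typing import Protocol

class Uploader(Protocol):
    """Minimal interface every backup destination implements."""
    def upload(self, local_path: str, remote_name: str) -> None: ...

class LocalDirUploader:
    """'Uploads' into a local directory -- useful for tests and NAS mounts."""
    def __init__(self, dest_dir: str) -> None:
        self.dest_dir = Path(dest_dir)
        self.dest_dir.mkdir(parents=True, exist_ok=True)

    def upload(self, local_path: str, remote_name: str) -> None:
        shutil.copy(local_path, self.dest_dir / remote_name)

class S3Uploader:
    """Same interface backed by boto3 (assumes credentials are configured)."""
    def __init__(self, bucket: str) -> None:
        import boto3
        self.bucket = bucket
        self.client = boto3.client('s3')

    def upload(self, local_path: str, remote_name: str) -> None:
        self.client.upload_file(local_path, self.bucket, remote_name)

def run_backup(archive_path: str, destination: Uploader) -> None:
    """The backup routine only sees the interface, not the provider."""
    destination.upload(archive_path, Path(archive_path).name)
```

Swapping destinations then means constructing a different uploader; the rest of the script is untouched.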
Specialized Backups: Protecting Databases
Simply copying the files of a live database is a recipe for disaster. Because the files are constantly being written to, you are almost guaranteed to end up with a corrupted or inconsistent backup. For reliable database backups, use the database's own native backup tooling.
Backing Up PostgreSQL
The command-line utility for creating logical backups of PostgreSQL is `pg_dump`. It generates a script of SQL commands that can recreate the database, and it can be invoked from Python using `subprocess`.
A note on security: never put the password directly in the command. Use a `.pgpass` file or an environment variable such as `PGPASSWORD`.
Practical Example: Dumping a PostgreSQL Database
```python
import subprocess
import datetime
import os

# Database configuration
DB_NAME = 'production_db'
DB_USER = 'backup_user'
DB_HOST = 'localhost'
BACKUP_DIR = '/var/backups/postgres/'

# Create a timestamped filename
timestamp = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M-%S')
backup_file = os.path.join(BACKUP_DIR, f'{DB_NAME}_{timestamp}.sql')

# Ensure the backup directory exists
os.makedirs(BACKUP_DIR, exist_ok=True)

# Set the PGPASSWORD environment variable for the subprocess
env = os.environ.copy()
env['PGPASSWORD'] = 'your_secure_password'  # In production, get this from a secrets manager!

command = [
    'pg_dump',
    f'--dbname={DB_NAME}',
    f'--username={DB_USER}',
    f'--host={DB_HOST}',
    f'--file={backup_file}'
]

try:
    print(f"Starting PostgreSQL backup for database '{DB_NAME}'...")
    # We pass the modified environment to the subprocess
    subprocess.run(command, check=True, env=env, capture_output=True)
    print(f"Database backup successful. File created: {backup_file}")
except subprocess.CalledProcessError as e:
    print("PostgreSQL backup failed.")
    print("Error:", e.stderr.decode())
```
Backing Up MySQL/MariaDB
The process for MySQL or MariaDB is very similar, using the `mysqldump` utility. For credentials, the best practice is to use an option file such as `~/.my.cnf` to avoid exposing the password.
Practical Example: Dumping a MySQL Database
```python
import subprocess
import datetime
import os

DB_NAME = 'production_db'
DB_USER = 'backup_user'
BACKUP_DIR = '/var/backups/mysql/'

# For this to work without a password, create a .my.cnf file in the user's home directory:
# [mysqldump]
# user = backup_user
# password = your_secure_password

timestamp = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M-%S')
backup_file_path = os.path.join(BACKUP_DIR, f'{DB_NAME}_{timestamp}.sql')

os.makedirs(BACKUP_DIR, exist_ok=True)

command = [
    'mysqldump',
    f'--user={DB_USER}',
    DB_NAME
]

try:
    print(f"Starting MySQL backup for database '{DB_NAME}'...")
    with open(backup_file_path, 'w') as f:
        subprocess.run(command, check=True, stdout=f, stderr=subprocess.PIPE)
    print(f"Database backup successful. File created: {backup_file_path}")
except subprocess.CalledProcessError as e:
    print("MySQL backup failed.")
    print("Error:", e.stderr.decode())
```
Handling SQLite
SQLite is a serverless, file-based database, so backups are much simpler. Python's built-in `sqlite3` module includes a dedicated online backup API that can safely copy a live database to another file without interrupting it.
Practical Example: Backing Up an SQLite Database
```python
import sqlite3

def backup_sqlite_db(db_path, backup_path):
    """Creates a backup of a live SQLite database."""
    print(f"Backing up '{db_path}' to '{backup_path}'...")
    # Connect to the source database
    source_conn = sqlite3.connect(db_path)
    # Connect to the destination database (it will be created)
    backup_conn = sqlite3.connect(backup_path)
    try:
        with backup_conn:
            source_conn.backup(backup_conn)
        print("Backup successful.")
    except sqlite3.Error as e:
        print(f"Backup failed: {e}")
    finally:
        source_conn.close()
        backup_conn.close()

# Usage
backup_sqlite_db('/path/to/my_app.db', '/var/backups/sqlite/my_app_backup.db')
```
Automation and Scheduling: The "Set It and Forget It" Approach
A backup strategy is only effective if it runs consistently. Manual backups are easy to forget; automation is the key to reliability.
Using Cron Jobs (Linux/macOS)
Cron is the standard time-based job scheduler on Unix-like operating systems. You can create a crontab entry that runs your Python backup script on a regular schedule. To edit the crontab, run `crontab -e` in a terminal.
An example crontab entry that runs the script every day at 2:30 AM:
```
30 2 * * * /usr/bin/python3 /path/to/your/backup_script.py >> /var/log/backups.log 2>&1
```
This entry runs the script and redirects both standard output and standard error to a log file, which is invaluable for monitoring.
Using Windows Task Scheduler
On Windows, Task Scheduler is the built-in equivalent of cron. Through its graphical interface you can create a new task, specify a trigger (for example, daily at a given time), and set an action that runs your Python script (`python.exe C:\path\to\backup_script.py`).
In-App Scheduling with `apscheduler`
If your backup logic is part of a long-running Python application, or if you need a cross-platform solution managed entirely within Python, the `apscheduler` library is an excellent choice.
First, install it: `pip install apscheduler`
Practical Example: A Simple Scheduler That Runs a Backup Function Every Hour
```python
from apscheduler.schedulers.blocking import BlockingScheduler
import time

def my_backup_job():
    print(f"Performing backup job at {time.ctime()}...")
    # Insert your backup logic here (e.g., call the S3 upload function)

scheduler = BlockingScheduler()

# Schedule job to run every hour
scheduler.add_job(my_backup_job, 'interval', hours=1)

# Schedule job to run every day at 3:00 AM in a specific timezone
scheduler.add_job(my_backup_job, 'cron', hour=3, minute=0, timezone='UTC')

print("Scheduler started. Press Ctrl+C to exit.")
try:
    scheduler.start()
except (KeyboardInterrupt, SystemExit):
    pass
```
Best Practices for a Robust Backup System
Building the scripts is only half the battle. Following these best practices elevates a backup system from a mere script to a resilient data protection strategy.
- Encryption: Always encrypt sensitive backups, especially before sending them to a remote or cloud location. Python's `cryptography` library is a powerful tool for this, letting you encrypt an archive before uploading it.
- Logging and monitoring: Backup scripts should produce clear logs of their activity. Record what was backed up, where it went, and, most importantly, any errors that occurred. Set up automated notifications (e.g., via email or a messaging platform such as Slack) so that you hear about failed backups immediately.
- Test your backups: This is the most important and most overlooked item on the checklist. A backup is not a backup until it has been successfully restored. Schedule regular tests that restore data from a backup into a non-production environment. This verifies both that the backups are not corrupted and that your restore procedure actually works.
- Secure credential management: It bears repeating: never hard-code passwords, API keys, or other secrets in your code. Use environment variables, `.env` files (with `python-dotenv`), or a dedicated secrets management service (such as AWS Secrets Manager or HashiCorp Vault).
- Versioning: Do not overwrite the same backup file on every run. Keep multiple versions (e.g., daily backups for the past week and weekly backups for the past month). This protects you from situations where data corruption goes unnoticed for days and the corrupted state is then faithfully backed up. Timestamped filenames are a simple form of versioning.
- Idempotency: Make sure your script can be run multiple times without negative side effects. If a run fails midway and is re-run, it should resume, or clean up and start over, rather than leaving things in a broken state.
- Error handling: Build comprehensive `try...except` blocks into your code to gracefully handle potential problems such as network outages, permission errors, full disks, and API throttling from cloud providers.
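To make the encryption point above concrete, here is a minimal sketch using Fernet (symmetric, authenticated encryption) from the `cryptography` library. The function names and the key-handling comments are illustrative assumptions, not a prescribed design:

```python
from cryptography.fernet import Fernet

def encrypt_file(src_path: str, dst_path: str, key: bytes) -> None:
    """Encrypt src_path into dst_path with a Fernet key."""
    with open(src_path, 'rb') as f:
        plaintext = f.read()
    with open(dst_path, 'wb') as f:
        f.write(Fernet(key).encrypt(plaintext))

def decrypt_file(src_path: str, dst_path: str, key: bytes) -> None:
    """Decrypt a file produced by encrypt_file."""
    with open(src_path, 'rb') as f:
        token = f.read()
    with open(dst_path, 'wb') as f:
        f.write(Fernet(key).decrypt(token))

# Generate a key once, store it in a secrets manager, and reuse it:
# key = Fernet.generate_key()
# encrypt_file('backup.tar.gz', 'backup.tar.gz.enc', key)
```

Note that losing the key means losing the backups, so the key itself must be stored and backed up as carefully as the data.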
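The versioning point pairs naturally with a retention policy: keep the newest N timestamped archives and prune the rest. A stdlib-only sketch (the directory layout, filename pattern, and `keep` count are assumptions):

```python
from pathlib import Path

def prune_backups(backup_dir: str, keep: int = 7, pattern: str = '*.tar.gz') -> list:
    """Delete all but the `keep` newest backups (by modification time).

    Returns the names of the files that were removed."""
    files = sorted(
        Path(backup_dir).glob(pattern),
        key=lambda p: p.stat().st_mtime,
        reverse=True,  # newest first
    )
    removed = []
    for old_file in files[keep:]:
        old_file.unlink()
        removed.append(old_file.name)
    return removed

# Example: after each daily backup, keep only the last 7 archives.
# prune_backups('/var/backups/application/', keep=7)
```

Running this at the end of each scheduled backup keeps disk usage bounded while preserving a window of history to roll back to.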
Conclusion
Data protection is a non-negotiable aspect of modern software engineering and systems administration. With its simplicity, powerful libraries, and broad integration capabilities, Python stands out as an excellent tool for building customized, automated, and robust backup solutions.
Starting from the fundamental 3-2-1 rule, you can build a comprehensive data protection system by implementing local, remote, and cloud-based strategies step by step. We have covered everything from basic file operations with `shutil`, to secure remote transfers with `rsync` and `paramiko`, to cloud integration with `boto3`, to specialized database dumps. Remember that automation is your greatest ally in ensuring consistency, and that rigorous testing is the only way to guarantee reliability.
Start simple: begin with a script that archives an important directory and uploads it to the cloud, then incrementally add logging, error handling, and notifications. Investing time in a solid backup strategy today builds a resilient foundation that protects your most valuable digital assets from tomorrow's uncertainties.