Bash Script For Mac To Download File From Internet


I have a minimal headless *nix which does not have any command line utilities for downloading files (e.g. no curl, wget, etc). I only have bash.


How can I download a file?

Ideally, I would like a solution that would work across a wide range of *nix.

Chris Snow

If you have bash 2.04 or above with the /dev/tcp pseudo-device enabled, you can download a file from bash itself.

Paste the following code directly into a bash shell (you don't need to save the code into a file for executing):
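Something along these lines (the function name __wget and the details of the header handling are illustrative; it opens the /dev/tcp pseudo-device on file descriptor 3, writes a raw HTTP request, and echoes everything after the Connection: close response header):

```shell
function __wget() {
    local URL=$1
    local tag="Connection: close"
    local mark=0

    if [ -z "${URL}" ]; then
        printf "Usage: %s URL [e.g.: %s http://www.google.com/]\n" \
               "${FUNCNAME[0]}" "${FUNCNAME[0]}"
        return 1
    fi

    # crude URL parsing: turn the slashes into spaces and read the fields
    read proto server path <<< "${URL//\// }"
    DOC=/${path// //}
    HOST=${server%%:*}
    PORT=${server##*:}
    [[ "${HOST}" == "${PORT}" ]] && PORT=80   # no explicit port in the URL

    # open a TCP connection on file descriptor 3 and send the request
    exec 3<>"/dev/tcp/${HOST}/${PORT}"
    echo -en "GET ${DOC} HTTP/1.1\r\nHost: ${HOST}\r\n${tag}\r\n\r\n" >&3

    # echo every line after the "Connection: close" response header
    while read line; do
        [[ ${mark} -eq 1 ]] && echo $line
        [[ "$line" =~ ${tag} ]] && mark=1
    done <&3
    exec 3>&-
}
```

Hypothetical usage: __wget http://www.google.com/ > index.html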

Then you can execute it from the shell, passing the URL as the first argument and redirecting the output to a file:

Source: Moreaki's answer to "Upgrading and installing packages through the Cygwin command line?"

Update: as mentioned in the comments, the approach outlined above is simplistic:

  • read without -r will trash backslashes and strip leading whitespace.
  • Bash can't deal with NUL bytes very nicely, so binary files are out.
  • The unquoted $line will undergo globbing and word splitting.

Use lynx.

It is pretty commonly available on Unix/Linux systems.

-dump: dump the first file to stdout and exit
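For example (host and path are placeholders; the command is guarded so it only runs where lynx is installed):

```shell
# -dump renders the page to stdout; use -source instead for the raw bytes
command -v lynx >/dev/null && lynx -dump http://example.com/ > /tmp/page.txt
```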

Or netcat:
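With netcat you speak HTTP by hand (host and path are placeholders); note that this writes the whole response, headers included, into the file:

```shell
# pipe a raw HTTP/1.0 request into nc; guarded so it only runs where nc exists
command -v nc >/dev/null &&
  printf 'GET /file HTTP/1.0\r\nHost: example.com\r\n\r\n' |
    nc example.com 80 > /tmp/file
```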

Or telnet:
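A telnet version might look like this (host and path are placeholders); the sleep keeps stdin open long enough for the response to arrive before telnet exits:

```shell
# guarded so it only runs where telnet is installed
command -v telnet >/dev/null &&
  { printf 'GET /file HTTP/1.0\r\nHost: example.com\r\n\r\n'; sleep 2; } |
    telnet example.com 80 > /tmp/file
```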



Adapted from Chris Snow's answer. This version can also handle binary files:
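A sketch of such a function (the name __wget and the exact layout are illustrative), which skips the headers up to the blank CRLF line and then hands the rest of the stream to cat:

```shell
function __wget() {
    local URL=$1

    if [ -z "${URL}" ]; then
        printf "Usage: %s URL\n" "${FUNCNAME[0]}"
        return 1
    fi

    read -r proto server path <<< "${URL//\// }"
    local DOC=/${path// //}
    local HOST=${server%%:*}
    local PORT=${server##*:}
    [[ "${HOST}" == "${PORT}" ]] && PORT=80

    exec 3<>"/dev/tcp/${HOST}/${PORT}"
    # HTTP 1.0: the server closes the connection after the response
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "${DOC}" "${HOST}" >&3

    # read past the headers (they end at a lone \r), then break && cat
    # streams the rest -- the raw, possibly binary body -- unmodified
    (while read line; do
        [[ "$line" == $'\r' ]] && break
    done && cat) <&3
    exec 3>&-
}
```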

  • I break && cat to get out of the header-reading loop; everything after the blank line goes through cat untouched.
  • I use HTTP 1.0, so there's no need to wait for/send a Connection: close header.

You can test it on binary files like this:
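For instance (assuming the function above was defined as __wget; the URL and reference path are placeholders), fetch a known binary and compare checksums:

```shell
# run only if the __wget function has been defined in this shell
if type __wget >/dev/null 2>&1; then
    __wget http://example.com/bin/somebinary > /tmp/downloaded
    md5sum /tmp/downloaded           # compare this sum...
    md5sum /path/to/reference/copy   # ...with a known-good local copy
fi
```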


Taking 'just Bash and nothing else' strictly, here's one adaptation of earlier answers (@Chris's, @131's) that does not call any external utilities (not even standard ones) but also works with binary files:
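A sketch of that function, named download to match the usage below (the parsing details are illustrative):

```shell
function download() {
    read -r proto server path <<< "${1//\// }"
    local DOC=/${path// //}
    local HOST=${server%%:*}
    local PORT=${server##*:}
    [[ "${HOST}" == "${PORT}" ]] && PORT=80

    exec 3<>"/dev/tcp/${HOST}/${PORT}"

    # send the request
    echo -en "GET ${DOC} HTTP/1.0\r\nHost: ${HOST}\r\n\r\n" >&3

    # read the headers; they end at an empty line (just a CR)
    while IFS= read -r line; do
        [[ "$line" == $'\r' ]] && break
    done <&3

    # read the body: read -d '' reads up to a NUL byte and returns true
    # if it found one; Bash strings can't hold NULs, so print the NUL
    # back manually after each successful read
    local nul='\0' x
    while IFS= read -d '' -r x || { nul=""; [ -n "$x" ]; }; do
        printf "%s$nul" "$x"
    done <&3
    exec 3>&-
}
```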

Use with download http://path/to/file > file.

We deal with NUL bytes with read -d '' (an empty delimiter). It reads until a NUL byte, and returns true if it found one, false if it didn't. Bash can't handle NUL bytes in strings, so when read returns true, we add the NUL byte manually when printing; when it returns false, we know there are no NUL bytes any more, and this should be the last piece of data.

Tested with Bash 4.4 on files with NULs in the middle, and ending in zero, one or two NULs, and also with the wget and curl binaries from Debian. The 373 kB wget binary took about 5.7 seconds to download. A speed of about 65 kB/s or a bit more than 512 kb/s.

In comparison, @131's cat-solution finishes in less than 0.1 s, or almost a hundred times faster. Not very surprising, really.

This is obviously silly, since without using external utilities, there's not much we can do with the downloaded file, not even make it executable.


If you have the libwww-perl package installed

You can simply use:
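The URL and output path below are placeholders; the command is guarded so it only runs where GET is actually installed:

```shell
# GET ships with libwww-perl (Debian/Ubuntu: apt-get install libwww-perl)
command -v GET >/dev/null && GET http://example.com/path/to/file > /tmp/file
```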


Based on @Chris Snow's recipe, I made some improvements:

  • http scheme check (it only supports http)
  • http response validation (checks the response status line, and splits header and body at the empty '\r\n' line rather than at 'Connection: close', which is not always present)
  • fails on non-200 status codes (this is important when downloading files from the internet)

Here is code:
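A sketch implementing those improvements (the function name __wget and the parsing details are illustrative):

```shell
function __wget() {
    local URL=$1
    if [ -z "${URL}" ]; then
        printf "Usage: %s URL\n" "${FUNCNAME[0]}" >&2
        return 1
    fi

    local proto server path
    read -r proto server path <<< "${URL//\// }"
    local SCHEME=${proto%:}
    local DOC=/${path// //}
    local HOST=${server%%:*}
    local PORT=${server##*:}
    [[ "${HOST}" == "${PORT}" ]] && PORT=80

    # http scheme check: /dev/tcp cannot speak TLS, so https is out
    if [[ "${SCHEME}" != "http" ]]; then
        printf "sorry, %s only supports http\n" "${FUNCNAME[0]}" >&2
        return 1
    fi

    exec 3<>"/dev/tcp/${HOST}/${PORT}"
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "${DOC}" "${HOST}" >&3

    # response status line check: fail on anything but 200
    local line version status
    IFS= read -r line <&3
    line=${line%$'\r'}
    read -r version status _ <<< "$line"
    if [[ "${status}" != "200" ]]; then
        printf 'request failed: %s\n' "$line" >&2
        exec 3>&-
        return 1
    fi

    # headers end at the empty \r\n line, not at "Connection: close"
    while IFS= read -r line; do
        [[ "$line" == $'\r' || -z "$line" ]] && break
    done <&3

    cat <&3   # body
    exec 3>&-
}
```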

Yecheng Fu


Use uploading instead, via SSH from your local machine

A 'minimal headless *nix' box means you probably SSH into it, so you can also use SSH to upload to it. This is functionally equivalent to downloading (of software packages etc.), except of course when you need a download command to include in a script on your headless server.

As shown in this answer, you would execute the following on your local machine to place a file on your remote headless server:
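For example (user@headless-server and both paths are placeholders):

```shell
# run on your LOCAL machine to place a file on the remote headless server
scp /local/path/file user@headless-server:/remote/path/

# or, if scp is unavailable on the server, stream it over plain ssh:
cat /local/path/file | ssh user@headless-server 'cat > /remote/path/file'
```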

Faster uploading via SSH from a third machine

The disadvantage of the above solution compared to downloading is lower transfer speed, since the connection with your local machine usually has much less bandwidth than the connection between your headless server and other servers.

To solve that, you can of course execute the above command on another server with decent bandwidth. To make that more comfortable (avoiding a manual login on the third machine), here is a command to execute on your local machine.

To be secure, copy & paste that command including the leading space character ' '. See the explanations below for the reason.
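Based on the explanations below, the command would look roughly like this (every login, hostname, password, URL and path is a placeholder; note the leading space):

```shell
 sshpass -f <(printf '%s\n' yourpassword) \
   ssh user@intermediate-host \
   "sshpass -f <(printf '%s\n' yourpassword) \
      ssh -T -e none -o StrictHostKeyChecking=no user@target-host \
      \"cat > /path/to/target-file\" \
      < <(wget -O - http://example.com/source-file)"
```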


  • The command will ssh to your third machine intermediate-host, start downloading a file to there via wget, and start uploading it to target-host via SSH. Downloading and uploading use the bandwidth of your intermediate-host and happen at the same time (due to Bash pipe equivalents), so progress will be fast.

  • When using this, you have to replace the two server logins (user@*-host), the target host password (yourpassword), the download URL (…) and the output path on your target host (/path/to/) with appropriate own values.

  • For the -T -e none SSH options when using it to transfer files, see these detailed explanations.

  • This command is meant for cases where you can't use SSH's public key authentication mechanism – it still happens with some shared hosting providers, notably Host Europe. To still automate the process, we rely on sshpass to be able to supply the password in the command. It requires sshpass to be installed on your intermediate host (sudo apt-get install sshpass under Ubuntu).

  • We try to use sshpass in a secure way, but it will still not be as secure as the SSH pubkey mechanism (says man sshpass). In particular, we supply the SSH password not as a command line argument but via a file, which is replaced by bash process substitution to make sure it never exists on disk. The printf is a bash built-in, making sure this part of the code does not pop up as a separate command in ps output as that would expose the password [source]. I think that this use of sshpass is just as secure as the sshpass -d<file-descriptor> variant recommended in man sshpass, because bash maps it internally to such a /dev/fd/* file descriptor anyway. And that without using a temp file [source]. But no guarantees, maybe I overlooked something.

  • Again to make the sshpass usage safe, we need to prevent the command from being recorded to the bash history on your local machine. For that, the whole command is prepended with one space character, which has this effect.

  • The -o StrictHostKeyChecking=no part prevents the command from failing in case it never connected to the target host. (Normally, SSH would then wait for user input to confirm the connection attempt. We make it proceed anyway.)

  • sshpass expects an ssh or scp command as its last argument, so we have to rewrite the typical wget -O - … | ssh … pipeline into a form without a bash pipe, as explained here.



Not the answer you're looking for? Browse other questions tagged bash command-line web or ask your own question.