202

I have a tmp.txt file containing variables to be exported, for example:

a=123
b="hello world"
c="one more variable"

How can I export all these variables using the export command, so that they can later be used by child processes?

Marco Bonelli
Neerav

11 Answers

367
set -a
. ./tmp.txt
set +a

set -a causes variables¹ defined from now on to be automatically exported. It's available in any Bourne-like shell. . is the standard (Bourne) name for the source command, so I prefer it for portability (source comes from csh and is now available in most modern Bourne-like shells, including bash, though sometimes with slightly different behaviour).

In POSIX shells, you can also use set -o allexport as a more descriptive alternative way to write it (set +o allexport to unset).
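For instance, a minimal end-to-end sketch (recreating a tmp.txt like the one in the question) that checks a child process actually sees the values:

```shell
# Recreate the example file from the question
printf '%s\n' 'a=123' 'b="hello world"' 'c="one more variable"' > tmp.txt

set -o allexport    # same as: set -a
. ./tmp.txt
set +o allexport    # same as: set +a

# A child process now inherits the variables
sh -c 'echo "$a / $b / $c"'    # → 123 / hello world / one more variable
```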

You can make it a function with:

export_from() {
  # local is not a standard command but is pretty common. It's needed here
  # for this code to be re-entrant (for the case where sourced files to
  # call export_from). We still use _export_from_ prefix to namespace
  # those variables to reduce the risk of those variables being some of
  # those exported by the sourced file.
  local _export_from_ret _export_from_restore _export_from_file

  _export_from_ret=0

  # record current state of the allexport option. Some shells (ksh93/zsh)
  # have support for local scope for options, but there's no standard
  # equivalent.
  case $- in
    (*a*) _export_from_restore=;;
    (*)   _export_from_restore='set +a';;
  esac

  for _export_from_file do
    # using the command prefix removes the "special" attribute of the "."
    # command so that it doesn't exit the shell when failing.
    command . "$_export_from_file" || _export_from_ret="$?"
  done
  eval "$_export_from_restore"
  return "$_export_from_ret"
}

¹ In bash, beware that it also causes all functions declared while allexport is on to be exported to the environment (as BASH_FUNC_myfunction%% environment variables that are then imported by all bash shells run in that environment, even when running as sh).
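The caveat is easy to observe in a throwaway bash shell (a sketch; the exact variable name format may vary across bash versions):

```shell
# Sketch: under bash, a function defined while allexport is on is itself
# exported (as a BASH_FUNC_* environment variable) to child processes.
bash -c '
  set -a
  myfunction() { echo hi; }
  set +a
  env | grep BASH_FUNC_myfunction
'
```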

Stéphane Chazelas
  • if a variable's value contains blanks, the second run will fail – jk2K Nov 25 '19 at 14:01
  • 1
    What does set +a do? – learner Jun 24 '20 at 05:11
  • 2
    With set, - turns it on, and + turns it off. – Tim Keating Jun 11 '21 at 15:17
  • This was awesome, by the way. After struggling for an hour to figure out why manually exporting a bunch of variables wasn't working as expected, I simply added this flag and boom! Worked. Thanks! – Tim Keating Jun 11 '21 at 15:18
  • How was this not the accepted answer? Thank you for this. There's nothing better than an answer that not only works, but also teaches something along the way. Kudos. – Ryan McGeary Feb 23 '22 at 15:08
  • 1
    "set -a causes variables¹ defined from now on to be automatically exported." THANK YOU this is exactly what i was looking for a script that reads bash variable for envsubst – Fuseteam Jun 10 '22 at 17:09
130
source tmp.txt
export a b c
./child ...

Judging by your other question, you don't want to hardcode the variable names:

source tmp.txt
export $(cut -d= -f1 tmp.txt)

test it:

$ source tmp.txt
$ echo "$a $b $c"
123 hello world one more variable
$ perl -E 'say "@ENV{qw(a b c)}"'

$ export $(cut -d= -f1 tmp.txt)
$ perl -E 'say "@ENV{qw(a b c)}"'
123 hello world one more variable
glenn jackman
  • 6
    This won't work if the environment file contains comments, for example (e.g. files that can be reused by systemd's EnvironmentFile). – Chris Lamb Nov 12 '17 at 22:36
  • 8
    @ChrisLamb you can use `grep` to skip comments: `export $(grep --regexp ^[A-Z] tmp.txt | cut -d= -f1)` – gvee Jul 02 '18 at 16:05
  • 3
    The one-line version that ignore comments into your .env file `source .env && export $(sed '/^#/d' .env | cut -d= -f1)` – Francesco Bianco Jun 15 '21 at 13:21
10

A dangerous one-liner that doesn't require source:

export $(xargs <file)

Caveats:
  • It can't handle comments, frequently used in environment files
  • It can't handle values with whitespace, like in the question example
  • It may unintentionally expand glob patterns into filenames if they happen to match

It's a bit dangerous because it passes the lines through bash expansion, but it has been useful to me when I know I have safe environment files.
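The whitespace caveat is easy to reproduce (hypothetical file name):

```shell
printf '%s\n' 'a=123' 'b=hello world' > unsafe.env   # hypothetical file

export $(xargs < unsafe.env)

echo "$b"    # → hello  (the value was split; "world" became a separate name)
```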

villasv
4

Just do:

while read LINE; do export "$LINE"; done < ./tmp.txt
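Note one quoting caveat with input like the question's: quote characters in the file survive as literal characters in the value. A quick sketch (hypothetical file name):

```shell
printf '%s\n' 'b="hello world"' > quoted.env   # hypothetical file

while read LINE; do export "$LINE"; done < ./quoted.env

echo "$b"    # → "hello world"  (with the literal quote characters)
```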
  • 4
    This is brittle. It does not allow comments in the input file, does not handle quoted variable values properly, and fails on multi-line variables. Granted I don't have many multi-line variables, but I do use comments regularly and often need to use quotes for variable values. – Louis Jul 24 '19 at 15:53
  • @Louis: These issues can be fixed with little effort (see my answer below if you mind). The nice thing about this approach is that it is bash-only, single-line and no `source` processing (arbitrary script execution). – alecov Feb 01 '22 at 19:39
4

Just to complement @Stéphane Chazelas's excellent answer: you can also put set -a/set +a inside the file itself (e.g. "to_export.bash") like this...

#!/usr/bin/env bash

set -a
SOMEVAR_A="abcd"
SOMEVAR_B="efgh"
SOMEVAR_C=123456
set +a

... and then export all the variables contained in the file like this...

. ./to_export.bash

... or...

source ./to_export.bash

Thanks!

Eduardo Lucio
3

I've put together a solution that seems to work in all cases (spaces, comments, etc.) by combining various proposed solutions. Here it is:

eval $(egrep "^[^#;]" .env | xargs -d'\n' -n1 | sed 's/^/export /')
mauridb
  • I turned this into a bash script and run it as follows: `. ./export_env_file.sh .env` in order for this to work – William Le Oct 23 '22 at 14:03
2

This solution exports all key=value pairs found in the .env file, skipping empty lines and comment lines (#).

File: .env

ENV=local
DEBUG=True

Command:

$ export $(cat .env | egrep -v "(^#.*|^$)" | xargs)
  • 2
    values of multiple words must be quoted. This only removes lines which have # as first character on line, comments starting after a value are not removed – X Tian Mar 23 '21 at 11:40
1

My take:

dotenv() {
    local REPLY
    while read; do
        REPLY=$(printf %s\\n "${REPLY%%#*}" | xargs)
        [[ -n $REPLY ]] && export "$REPLY"
    done < <(envsubst)
}

Supports comments, processes spaces, quotes & backslashes (xargs-processing), and expands environment variables. Avoids arbitrary script execution from source.

Without xargs & envsubst, the syntax changes a bit (no unquoting or general post-processing), but comments are still supported and the function is bash-only.
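For reference, one possible reading of that xargs/envsubst-free variant (a sketch, not the author's exact code; hypothetical helper name `dotenv_plain`; values are exported verbatim, with no unquoting or expansion):

```shell
dotenv_plain() {
    # Minimal sketch: skip comments, export KEY=value lines verbatim.
    while IFS= read -r _line; do
        _line=${_line%%#*}              # strip comments
        case $_line in
            *=*) export "$_line" ;;     # no unquoting or expansion
        esac
    done
}

printf '%s\n' '# comment' 'a=123' 'b=hello world' > sample.env
dotenv_plain < sample.env
echo "$a $b"    # → 123 hello world
```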

Can be further improved to provide nicer error messages in bad lines.

alecov
1

You don't need to run export on the content of a line; you just use it to mark the names you want exported. I use this function:

sourcery () {
    local file vars
    for file; do
            # shellcheck disable=SC1090
            source "$file" && {
                    mapfile -t vars < <(sed -nE '/^[[:space:]]*#/d;s/^[[:space:]]*([[:alpha:]_][[:alnum:]_]*)=.*/\1/p' "$file")
                    export "${vars[@]}"
            }
    done
}

This will source each file you pass to it and then read those files looking for uncommented variable declarations. It will mark each of the variables it finds as exported.

This is relatively safe because it relies on your shell to actually parse and resolve the content of the env file and only does string munging to figure out which symbols should get the export bit.

My version assumes bash and a sed supporting -E, but you could write a highly portable version of the same idea without too much trouble.

Sorpigal
  • This works well, except for an issue I encountered i.e. I am unable to completely read strings that contain the dollar sign e.g. one of my password strings in an environment file contains a dollar sign. If I use single quotes on the password string, then I can read it.. However, if I use single quotes for every line, then the variable expansion from previously defined variables no longer happens... – Cogicero Aug 24 '22 at 01:55
  • P.S. I think I fixed this by using double quotes wherever variable expansion was needed, and single quotes elsewhere. – Cogicero Aug 24 '22 at 02:15
  • You are correct that single quoted strings disable interpolation of variables. That's true of everywhere in the shell and remains true in this scenario. – Sorpigal Aug 24 '22 at 13:54
1

For my use-case,

export $(< ~/my/.env)

works as desired.
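Worth noting: $(< file) is a bash/ksh/zsh operator, and the expansion is word-split, so this suits files without comments or whitespace in values. A quick sketch with a hypothetical file:

```shell
printf '%s\n' 'x=1' 'y=2' > simple.env   # hypothetical file

export $(< simple.env)

sh -c 'echo "$x$y"'    # → 12
```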

Tilman Vogel
0

A little workaround based on one of the other answers:

  1. create function and place it in ~/.bashrc
function myenvs() {
    if [ -z "$1" ]; then
        echo "Usage: myenvs [import file path]"
    else
        if [ -f "$1" ]; then
            source "$1" 2>/dev/null
            export $(grep "=" "$1" | grep -v "^#" | awk '/./' | cut -d= -f1 | xargs)
        else
            echo "Bad file path: $1"
        fi
    fi
}
  2. run $ myenvs /path/to/env/file to import the variables

※ If the env file has bad lines, errors may appear when it is sourced. I just hide them (2>/dev/null), so error handling is up to you.

rzlvmp