Ambari Python script timeout error

I am getting errors during the Install, Start and Test step: "Python script has been killed due to timeout after waiting 900 secs".

The log is attached below.

stderr: 
Python script has been killed due to timeout after waiting 900 secs
 stdout:
2015-04-16 16:43:04,609 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://vagrant-centos65.vagrantup.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:04,683 - Group['hadoop'] {'ignore_failures': False}
2015-04-16 16:43:04,685 - Adding group Group['hadoop']
2015-04-16 16:43:04,755 - Group['nobody'] {'ignore_failures': False}
2015-04-16 16:43:04,756 - Modifying group nobody
2015-04-16 16:43:04,816 - Group['users'] {'ignore_failures': False}
2015-04-16 16:43:04,817 - Modifying group users
2015-04-16 16:43:04,853 - Group['nagios'] {'ignore_failures': False}
2015-04-16 16:43:04,854 - Adding group Group['nagios']
2015-04-16 16:43:04,896 - Group['knox'] {'ignore_failures': False}
2015-04-16 16:43:04,896 - Adding group Group['knox']
2015-04-16 16:43:04,950 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2015-04-16 16:43:04,951 - Modifying user nobody
2015-04-16 16:43:05,025 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,026 - Adding user User['hive']
2015-04-16 16:43:05,131 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,131 - Adding user User['oozie']
2015-04-16 16:43:05,199 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,200 - Adding user User['nagios']
2015-04-16 16:43:05,384 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,385 - Adding user User['ambari-qa']
2015-04-16 16:43:05,504 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,505 - Adding user User['flume']
2015-04-16 16:43:05,612 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,613 - Adding user User['hdfs']
2015-04-16 16:43:05,682 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,683 - Adding user User['knox']
2015-04-16 16:43:05,738 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,738 - Adding user User['storm']
2015-04-16 16:43:05,806 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,806 - Adding user User['mapred']
2015-04-16 16:43:05,859 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,860 - Adding user User['hbase']
2015-04-16 16:43:05,916 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,917 - Adding user User['tez']
2015-04-16 16:43:05,972 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,972 - Adding user User['zookeeper']
2015-04-16 16:43:06,041 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,042 - Adding user User['kafka']
2015-04-16 16:43:06,097 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,098 - Adding user User['falcon']
2015-04-16 16:43:06,161 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,162 - Adding user User['sqoop']
2015-04-16 16:43:06,228 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,229 - Adding user User['yarn']
2015-04-16 16:43:06,287 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,287 - Adding user User['hcat']
2015-04-16 16:43:06,345 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-16 16:43:06,346 - Writing File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] because it doesn't exist
2015-04-16 16:43:06,346 - Changing permission for /var/lib/ambari-agent/data/tmp/changeUid.sh from 644 to 555
2015-04-16 16:43:06,347 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-04-16 16:43:06,469 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-16 16:43:06,470 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2015-04-16 16:43:06,599 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-04-16 16:43:06,599 - Creating directory Directory['/etc/hadoop/conf.empty']
2015-04-16 16:43:06,600 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-04-16 16:43:06,623 - Creating symbolic Link['/etc/hadoop/conf']
2015-04-16 16:43:06,651 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2015-04-16 16:43:06,651 - Writing File['/etc/hadoop/conf/hadoop-env.sh'] because it doesn't exist
2015-04-16 16:43:06,652 - Changing owner for /etc/hadoop/conf/hadoop-env.sh from 0 to hdfs
2015-04-16 16:43:06,684 - Repository['HDP-2.2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.2.4.2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-04-16 16:43:06,710 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-16 16:43:06,711 - Writing File['/etc/yum.repos.d/HDP.repo'] because it doesn't exist
2015-04-16 16:43:06,711 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-04-16 16:43:06,722 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-16 16:43:06,722 - Writing File['/etc/yum.repos.d/HDP-UTILS.repo'] because it doesn't exist
2015-04-16 16:43:06,722 - Package['unzip'] {}
2015-04-16 16:43:07,144 - Skipping installing existent package unzip
2015-04-16 16:43:07,145 - Package['curl'] {}
2015-04-16 16:43:07,463 - Skipping installing existent package curl
2015-04-16 16:43:07,464 - Package['hdp-select'] {}
2015-04-16 16:43:07,724 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
2015-04-16 16:43:11,315 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://vagrant-centos65.vagrantup.com:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:11,323 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://vagrant-centos65.vagrantup.com:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] due to not_if
2015-04-16 16:43:11,324 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:11,333 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
2015-04-16 16:43:11,564 - Package['hadoop_2_2_*-yarn'] {}
2015-04-16 16:43:11,825 - Installing package hadoop_2_2_*-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_2_*-yarn')

I'm using Ambari 1.7 and following this installation guide.

Any help would be greatly appreciated. Thanks.



2 answers


With Ambari 2.2 (HDP 2.3.4.0) I am having the same problem:

Python script was killed due to timeout after waiting 1800 seconds



This can be solved by setting the timeout (agent.package.install.task.timeout=1800) in the /etc/ambari-server/conf/ambari.properties file.
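A minimal sketch of that change, assuming the default Ambari server layout (the path and the ambari-server command below are standard, but verify them on your install):

CONF=/etc/ambari-server/conf/ambari.properties

# Update the package-install timeout if the property already exists,
# otherwise append it.
if sudo grep -q '^agent.package.install.task.timeout' "$CONF"; then
    sudo sed -i 's/^agent.package.install.task.timeout=.*/agent.package.install.task.timeout=1800/' "$CONF"
else
    echo 'agent.package.install.task.timeout=1800' | sudo tee -a "$CONF"
fi

# Restart the Ambari server so the new timeout takes effect.
sudo ambari-server restart

After the restart, retry the failed step from the Ambari UI.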



I solved this problem.

It is a problem with the servers these packages are downloaded from: the error is raised when the response from those servers takes too long, so the timeout itself is not the root cause. You can try other mirrors for the packages, or install the failing package manually on your host. Ambari will then check whether it is installed correctly; if it is, it automatically moves on to the next package in the queue. A sketch of the manual route follows.
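For example, taking the package that was being installed when the log above ends (hadoop_2_2_*-yarn), a manual install could look like this; the yum invocation is the exact one Ambari ran and the repo file paths come from the log, so only the use of sudo is an assumption here:

# Optionally point the HDP repositories at a closer mirror first by
# editing the baseurl lines in the repo files Ambari generated:
#   /etc/yum.repos.d/HDP.repo
#   /etc/yum.repos.d/HDP-UTILS.repo

# Install the package that timed out, using the same command Ambari ran.
sudo /usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_2_*-yarn'

Then click Retry in the Ambari wizard: Ambari sees that the package is already installed and continues with the next one in the queue.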







