# Copyright 2022-2025 Broadcom.
# SPDX-License-Identifier: Apache-2
"""
Build process common methods.
"""
import logging
import os.path
import hashlib
import pathlib
import glob
import shutil
import tarfile
import tempfile
import time
import subprocess
import random
import sys
import io
import os
import multiprocessing
import pprint
import re
from html.parser import HTMLParser

from relenv.common import (
    DATA_DIR,
    LINUX,
    MODULE_DIR,
    RelenvException,
    build_arch,
    download_url,
    extract_archive,
    format_shebang,
    get_download_location,
    get_toolchain,
    get_triplet,
    runcmd,
    work_dirs,
    fetch_url,
)
import relenv.relocate

CHECK_VERSIONS_SUPPORT = True
try:
    from packaging.version import InvalidVersion, parse
    from looseversion import LooseVersion
except ImportError:
    CHECK_VERSIONS_SUPPORT = False

log = logging.getLogger(__name__)

GREEN = "\033[0;32m"
YELLOW = "\033[1;33m"
RED = "\033[0;31m"
END = "\033[0m"
MOVEUP = "\033[F"

CICD = "CI" in os.environ
NODOWLOAD = False

RELENV_PTH = (
    "import os; "
    "import sys; "
    "from importlib import util; "
    "from pathlib import Path; "
    "spec = util.spec_from_file_location("
    "'relenv.runtime', str(Path(__file__).parent / 'site-packages' / 'relenv' / 'runtime.py')"
    "); "
    "mod = util.module_from_spec(spec); "
    "sys.modules['relenv.runtime'] = mod; "
    "spec.loader.exec_module(mod); mod.bootstrap();"
)

SYSCONFIGDATA = """
import pathlib, sys, platform, os


def build_arch():
    machine = platform.machine()
    return machine.lower()


def get_triplet(machine=None, plat=None):
    if not plat:
        plat = sys.platform
    if not machine:
        machine = build_arch()
    if plat == "darwin":
        return f"{machine}-macos"
    elif plat == "win32":
        return f"{machine}-win"
    elif plat == "linux":
        return f"{machine}-linux-gnu"
    else:
        raise RuntimeError("Unknown platform {}".format(plat))


pydir = pathlib.Path(__file__).resolve().parent

if sys.platform == "win32":
    DEFAULT_DATA_DIR = pathlib.Path.home() / "AppData" / "Local" / "relenv"
else:
    DEFAULT_DATA_DIR = pathlib.Path.home() / ".local" / "relenv"

if "RELENV_DATA" in os.environ:
    DATA_DIR = pathlib.Path(os.environ["RELENV_DATA"]).resolve()
else:
    DATA_DIR = DEFAULT_DATA_DIR

buildroot = pydir.parent.parent
toolchain = DATA_DIR / "toolchain" / get_triplet()

build_time_vars = {}
for key in _build_time_vars:
    val = _build_time_vars[key]
    orig = val
    if isinstance(val, str):
        val = val.format(
            BUILDROOT=buildroot,
            TOOLCHAIN=toolchain,
        )
    build_time_vars[key] = val
"""
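

# Illustrative sketch (not part of relenv): the SYSCONFIGDATA template above
# rewrites every string entry of ``_build_time_vars`` by substituting the
# literal ``{BUILDROOT}`` and ``{TOOLCHAIN}`` placeholders that
# ``install_sysdata`` bakes into the file at build time. The values below
# are made up:
#
#     _build_time_vars = {"CC": "{TOOLCHAIN}/bin/gcc", "prefix": "{BUILDROOT}"}
#     val = _build_time_vars["CC"].format(
#         BUILDROOT="/home/user/.local/relenv/build",
#         TOOLCHAIN="/home/user/.local/relenv/toolchain/x86_64-linux-gnu",
#     )
#     assert val == "/home/user/.local/relenv/toolchain/x86_64-linux-gnu/bin/gcc"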


def print_ui(events, processes, fails, flipstat=None):
    """
    Print the UI during the relenv building process.

    :param events: A dictionary of events that are updated during the build process
    :type events: dict
    :param processes: A dictionary of build processes
    :type processes: dict
    :param fails: A list of processes that have failed
    :type fails: list
    :param flipstat: A dictionary of process statuses, defaults to {}
    :type flipstat: dict, optional
    """
    if flipstat is None:
        flipstat = {}
    if CICD:
        sys.stdout.flush()
        return
    uiline = []
    for name in events:
        if not events[name].is_set():
            status = " {}.".format(YELLOW)
        elif name in processes:
            now = time.time()
            if name not in flipstat:
                flipstat[name] = (0, now)
            if flipstat[name][1] < now:
                flipstat[name] = (1 - flipstat[name][0], now + random.random())
            status = " {}{}".format(GREEN, " " if flipstat[name][0] == 1 else ".")
        elif name in fails:
            status = " {}\u2718".format(RED)
        else:
            status = " {}\u2714".format(GREEN)
        uiline.append(status)
    uiline.append(" " + END)
    sys.stdout.write("\r")
    sys.stdout.write("".join(uiline))
    sys.stdout.flush()


def verify_checksum(file, checksum):
    """
    Verify the checksum of a file.

    :param file: The path to the file to check.
    :type file: str
    :param checksum: The checksum to verify against
    :type checksum: str

    :raises RelenvException: If the checksum verification failed

    :return: True if it succeeded, or False if the checksum was None
    :rtype: bool
    """
    if checksum is None:
        log.error("Can't verify checksum because none was given")
        return False
    with open(file, "rb") as fp:
        file_checksum = hashlib.sha1(fp.read()).hexdigest()
    if checksum != file_checksum:
        raise RelenvException(
            f"sha1 checksum verification failed. expected={checksum} found={file_checksum}"
        )
    return True


def all_dirs(root, recurse=True):
    """
    Get all directories under and including the given root.

    :param root: The root directory to traverse
    :type root: str
    :param recurse: Whether to recursively search for directories, defaults to True
    :type recurse: bool, optional

    :return: A list of directories found
    :rtype: list
    """
    paths = [root]
    for root, dirs, files in os.walk(root):
        for name in dirs:
            paths.append(os.path.join(root, name))
    return paths


def populate_env(dirs, env):
    """
    Default environment population hook; a no-op unless a recipe overrides it.
    """
    pass


def build_default(env, dirs, logfp):
    """
    The default build function if none is given during the build process.

    :param env: The environment dictionary
    :type env: dict
    :param dirs: The working directories
    :type dirs: ``relenv.build.common.Dirs``
    :param logfp: A handle for the log file
    :type logfp: file
    """
    cmd = [
        "./configure",
        "--prefix={}".format(dirs.prefix),
    ]
    if env["RELENV_HOST"].find("linux") > -1:
        cmd += [
            "--build={}".format(env["RELENV_BUILD"]),
            "--host={}".format(env["RELENV_HOST"]),
        ]
    runcmd(cmd, env=env, stderr=logfp, stdout=logfp)
    runcmd(["make", "-j8"], env=env, stderr=logfp, stdout=logfp)
    runcmd(["make", "install"], env=env, stderr=logfp, stdout=logfp)


def build_openssl_fips(env, dirs, logfp):
    """
    Build openssl with FIPS support enabled.
    """
    return build_openssl(env, dirs, logfp, fips=True)
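

# Illustrative sketch (not part of relenv): computing a digest the same way
# ``verify_checksum`` does, for a hypothetical downloaded archive. The path
# and expected digest below are made up:
#
#     def sha1_of(path):
#         with open(path, "rb") as fp:
#             return hashlib.sha1(fp.read()).hexdigest()
#
#     expected = "0123456789abcdef0123456789abcdef01234567"  # hypothetical
#     if sha1_of("downloads/example-1.0.tar.gz") != expected:
#         raise RelenvException("sha1 checksum verification failed.")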


def build_openssl(env, dirs, logfp, fips=False):
    """
    Build openssl.

    :param env: The environment dictionary
    :type env: dict
    :param dirs: The working directories
    :type dirs: ``relenv.build.common.Dirs``
    :param logfp: A handle for the log file
    :type logfp: file
    """
    arch = "aarch64"
    if sys.platform == "darwin":
        plat = "darwin64"
        if env["RELENV_HOST_ARCH"] == "x86_64":
            arch = "x86_64-cc"
        elif env["RELENV_HOST_ARCH"] == "arm64":
            arch = "arm64-cc"
        else:
            raise RelenvException(f"Unable to build {env['RELENV_HOST_ARCH']}")
        extended_cmd = []
    else:
        plat = "linux"
        if env["RELENV_HOST_ARCH"] == "x86_64":
            arch = "x86_64"
        elif env["RELENV_HOST_ARCH"] == "aarch64":
            arch = "aarch64"
        else:
            raise RelenvException(f"Unable to build {env['RELENV_HOST_ARCH']}")
        extended_cmd = [
            "-Wl,-z,noexecstack",
        ]
    if fips:
        extended_cmd.append("enable-fips")
    cmd = [
        "./Configure",
        f"{plat}-{arch}",
        f"--prefix={dirs.prefix}",
        "--openssldir=/etc/ssl",
        "--libdir=lib",
        "--api=1.1.1",
        "--shared",
        "--with-rand-seed=os,egd",
        "enable-md2",
        "enable-egd",
        "no-idea",
    ]
    cmd.extend(extended_cmd)
    runcmd(
        cmd,
        env=env,
        stderr=logfp,
        stdout=logfp,
    )
    runcmd(["make", "-j8"], env=env, stderr=logfp, stdout=logfp)
    if fips:
        shutil.copy(
            pathlib.Path("providers") / "fips.so",
            pathlib.Path(dirs.prefix) / "lib" / "ossl-modules",
        )
    else:
        runcmd(["make", "install_sw"], env=env, stderr=logfp, stdout=logfp)


def build_sqlite(env, dirs, logfp):
    """
    Build sqlite.

    :param env: The environment dictionary
    :type env: dict
    :param dirs: The working directories
    :type dirs: ``relenv.build.common.Dirs``
    :param logfp: A handle for the log file
    :type logfp: file
    """
    # extra_cflags=('-Os '
    #               '-DSQLITE_ENABLE_FTS5 '
    #               '-DSQLITE_ENABLE_FTS4 '
    #               '-DSQLITE_ENABLE_FTS3_PARENTHESIS '
    #               '-DSQLITE_ENABLE_JSON1 '
    #               '-DSQLITE_ENABLE_RTREE '
    #               '-DSQLITE_TCL=0 '
    #               )
    # configure_pre=[
    #     '--enable-threadsafe',
    #     '--enable-shared=no',
    #     '--enable-static=yes',
    #     '--disable-readline',
    #     '--disable-dependency-tracking',
    # ]
    cmd = [
        "./configure",
        # "--with-shared",
        # "--without-static",
        "--enable-threadsafe",
        "--disable-readline",
        "--disable-dependency-tracking",
        "--prefix={}".format(dirs.prefix),
        # "--enable-add-ons=nptl,ports",
    ]
    if env["RELENV_HOST"].find("linux") > -1:
        cmd += [
            "--build={}".format(env["RELENV_BUILD_ARCH"]),
            "--host={}".format(env["RELENV_HOST"]),
        ]
    runcmd(cmd, env=env, stderr=logfp, stdout=logfp)
    runcmd(["make", "-j8"], env=env, stderr=logfp, stdout=logfp)
    runcmd(["make", "install"], env=env, stderr=logfp, stdout=logfp)


def tarball_version(href):
    """
    Parse a version from an ``<a href>`` value like ``name-1.2.3.tar.gz``.
    """
    if href.endswith("tar.gz"):
        try:
            x = href.split("-", 1)[1][:-7]
            if x != "latest":
                return x
        except IndexError:
            return None


def sqlite_version(href):
    if "releaselog" in href:
        link = href.split("/")[1][:-5]
        return "{:d}{:02d}{:02d}00".format(*[int(_) for _ in link.split("_")])


def github_version(href):
    if "tag/" in href:
        return href.split("/v")[-1]


def krb_version(href):
    if re.match(r"\d\.\d\d/", href):
        return href[:-1]


def python_version(href):
    if re.match(r"(\d+\.)+\d/", href):
        return href[:-1]


def uuid_version(href):
    if "download" in href and "latest" not in href:
        return href[:-16].rsplit("/")[-1].replace("libuuid-", "")


def parse_links(text):
    """
    Return the href of every anchor tag found in the given HTML.
    """

    class HrefParser(HTMLParser):
        hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                link = dict(attrs).get("href", "")
                if link:
                    self.hrefs.append(link)

    parser = HrefParser()
    parser.feed(text)
    return parser.hrefs


def check_files(name, location, func, current):
    fp = io.BytesIO()
    fetch_url(location, fp)
    fp.seek(0)
    text = fp.read().decode()
    loose = False
    try:
        current = parse(current)
    except InvalidVersion:
        current = LooseVersion(current)
        loose = True
    versions = []
    for _ in parse_links(text):
        version = func(_)
        if version:
            if loose:
                versions.append(LooseVersion(version))
            else:
                try:
                    versions.append(parse(version))
                except InvalidVersion:
                    pass
    versions.sort()
    compare_versions(name, current, versions)


def compare_versions(name, current, versions):
    for version in versions:
        try:
            if version > current:
                print(f"Found new version of {name} {version} > {current}")
        except TypeError:
            print(f"Unable to compare versions {version}")
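

# Illustrative sketch (not part of relenv): how a ``checkfunc`` like
# ``tarball_version`` pairs with ``check_files``. The index URL is made up:
#
#     if CHECK_VERSIONS_SUPPORT:
#         check_files(
#             "sqlite",
#             "https://example.invalid/downloads/",  # hypothetical index page
#             tarball_version,
#             "3.45.1",
#         )
#
# ``check_files`` scrapes every <a href> from the index page, runs each one
# through the checkfunc, and prints any parsed version newer than ``current``.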


class Download:
    """
    A utility that holds information about content to be downloaded.

    :param name: The name of the download
    :type name: str
    :param url: The url of the download
    :type url: str
    :param fallback_url: A secondary url to try when the primary fails, defaults to None
    :type fallback_url: str
    :param signature: The signature of the download, defaults to None
    :type signature: str
    :param destination: The path to download the file to
    :type destination: str
    :param version: The version of the content to download
    :type version: str
    :param checksum: The sha1 sum of the download
    :type checksum: str
    """

    def __init__(
        self,
        name,
        url,
        fallback_url=None,
        signature=None,
        destination="",
        version="",
        checksum=None,
        checkfunc=None,
        checkurl=None,
    ):
        self.name = name
        self.url_tpl = url
        self.fallback_url_tpl = fallback_url
        self.signature_tpl = signature
        self.destination = destination
        self.version = version
        self.checksum = checksum
        self.checkfunc = checkfunc
        self.checkurl = checkurl

    def copy(self):
        """
        Create a copy of this download instance.
        """
        return Download(
            self.name,
            self.url_tpl,
            self.fallback_url_tpl,
            self.signature_tpl,
            self.destination,
            self.version,
            self.checksum,
            self.checkfunc,
            self.checkurl,
        )

    @property
    def url(self):
        return self.url_tpl.format(version=self.version)

    @property
    def fallback_url(self):
        if self.fallback_url_tpl:
            return self.fallback_url_tpl.format(version=self.version)

    @property
    def signature_url(self):
        return self.signature_tpl.format(version=self.version)

    @property
    def filepath(self):
        _, name = self.url.rsplit("/", 1)
        return pathlib.Path(self.destination) / name

    @property
    def formatted_url(self):
        return self.url.format(version=self.version)

    def fetch_file(self):
        """
        Download the file.

        :return: The path to the downloaded content, and whether it was downloaded.
        :rtype: tuple(str, bool)
        """
        try:
            return download_url(self.url, self.destination, CICD), True
        except Exception as exc:
            if self.fallback_url:
                print(f"Download failed {self.url} ({exc}); trying fallback url")
                return download_url(self.fallback_url, self.destination, CICD), True
            raise

    def fetch_signature(self, version):
        """
        Download the file signature.

        :return: The path to the downloaded signature.
        :rtype: str
        """
        return download_url(self.signature_url, self.destination, CICD)

    def exists(self):
        """
        True when the artifact already exists on disk.

        :return: True when the artifact already exists on disk
        :rtype: bool
        """
        return self.filepath.exists()

    def valid_hash(self):
        """
        Unimplemented placeholder; checksums are verified by ``validate_checksum``.
        """
        pass

    @staticmethod
    def validate_signature(archive, signature):
        """
        True when the archive's signature is valid.

        :param archive: The path to the archive to validate
        :type archive: str
        :param signature: The path to the signature to validate against
        :type signature: str

        :return: True if it validated properly, else False
        :rtype: bool
        """
        if signature is None:
            log.error("Can't check signature because none was given")
            return False
        try:
            runcmd(
                ["gpg", "--verify", signature, archive],
                stderr=subprocess.PIPE,
                stdout=subprocess.PIPE,
            )
            return True
        except RelenvException as exc:
            log.error("Signature validation failed on %s: %s", archive, exc)
            return False

    @staticmethod
    def validate_checksum(archive, checksum):
        """
        True when the archive matches the sha1 hash.

        :param archive: The path to the archive to validate
        :type archive: str
        :param checksum: The sha1 sum to validate against
        :type checksum: str

        :return: True if the sums matched, else False
        :rtype: bool
        """
        try:
            verify_checksum(archive, checksum)
            return True
        except RelenvException as exc:
            log.error("sha1 validation failed on %s: %s", archive, exc)
            return False

    def __call__(self, force_download=False, show_ui=False, exit_on_failure=False):
        """
        Download the url and validate the signature and sha1 sum.

        :return: Whether or not validation succeeded
        :rtype: bool
        """
        os.makedirs(self.filepath.parent, exist_ok=True)
        downloaded = False
        if force_download:
            _, downloaded = self.fetch_file()
        else:
            file_is_valid = False
            dest = get_download_location(self.url, self.destination)
            if self.checksum and os.path.exists(dest):
                file_is_valid = self.validate_checksum(dest, self.checksum)
            if file_is_valid:
                log.debug("%s already downloaded, skipping.", self.url)
            else:
                _, downloaded = self.fetch_file()
        valid = True
        if downloaded:
            if self.signature_tpl is not None:
                sig = self.fetch_signature(self.version)
                valid_sig = self.validate_signature(self.filepath, sig)
                valid = valid and valid_sig
            if self.checksum is not None:
                valid_checksum = self.validate_checksum(self.filepath, self.checksum)
                valid = valid and valid_checksum
        if not valid:
            log.warning("Checksum did not match %s: %s", self.name, self.checksum)
            if show_ui:
                sys.stderr.write(
                    f"\nChecksum did not match {self.name}: {self.checksum}\n"
                )
                sys.stderr.flush()
        if exit_on_failure and not valid:
            sys.exit(1)
        return valid

    def check_version(self):
        """
        Check for a newer version of this download at its source.
        """
        if self.checkurl:
            url = self.checkurl
        else:
            url = self.url.rsplit("/", 1)[0]
        check_files(self.name, url, self.checkfunc, self.version)


class Dirs:
    """
    A container for directories during build time.

    :param dirs: A collection of working directories
    :type dirs: ``relenv.common.WorkDirs``
    :param name: The name of this collection
    :type name: str
    :param arch: The architecture being worked with
    :type arch: str
    :param version: The Python version being built
    :type version: str
    """

    def __init__(self, dirs, name, arch, version):
        # XXX name is specific to a step whereas everything else here is
        # generalized to the entire build
        self.name = name
        self.version = version
        self.arch = arch
        self.root = dirs.root
        self.build = dirs.build
        self.downloads = dirs.download
        self.logs = dirs.logs
        self.sources = dirs.src
        self.tmpbuild = tempfile.mkdtemp(prefix="{}_build".format(name))

    @property
    def toolchain(self):
        if sys.platform in ("darwin", "win32"):
            return get_toolchain(root=self.root)
        return get_toolchain(self.arch, self.root)

    @property
    def _triplet(self):
        if sys.platform == "darwin":
            return "{}-macos".format(self.arch)
        elif sys.platform == "win32":
            return "{}-win".format(self.arch)
        else:
            return "{}-linux-gnu".format(self.arch)

    @property
    def prefix(self):
        return self.build / f"{self.version}-{self._triplet}"

    def __getstate__(self):
        """
        Return an object used for pickling.

        :return: The picklable state
        """
        return {
            "name": self.name,
            "arch": self.arch,
            "root": self.root,
            "build": self.build,
            "downloads": self.downloads,
            "logs": self.logs,
            "sources": self.sources,
            "tmpbuild": self.tmpbuild,
        }

    def __setstate__(self, state):
        """
        Unwrap the object returned from unpickling.

        :param state: The state to unpickle
        :type state: dict
        """
        self.name = state["name"]
        self.arch = state["arch"]
        self.root = state["root"]
        self.downloads = state["downloads"]
        self.logs = state["logs"]
        self.sources = state["sources"]
        self.build = state["build"]
        self.tmpbuild = state["tmpbuild"]

    def to_dict(self):
        """
        Get a dictionary representation of the directories in this collection.

        :return: A dictionary of all the directories
        :rtype: dict
        """
        return {
            x: getattr(self, x)
            for x in [
                "root",
                "prefix",
                "downloads",
                "logs",
                "sources",
                "build",
                "toolchain",
            ]
        }


class Builds:
    """
    Collection of builds.
    """

    def __init__(self):
        self.builds = {}

    def add(self, platform, *args, **kwargs):
        """
        Add a builder for the given platform.
        """
        if "builder" in kwargs:
            build = kwargs.pop("builder")
            if args or kwargs:
                raise RuntimeError(
                    "builder keyword can not be used with other kwargs or args"
                )
        else:
            build = Builder(*args, **kwargs)
        if platform not in self.builds:
            self.builds[platform] = {build.version: build}
        else:
            self.builds[platform][build.version] = build
        return build


builds = Builds()
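

# Illustrative sketch (not part of relenv): registering a download the way
# recipes do. The URL template and digest below are made up:
#
#     dl = Download(
#         "example",
#         "https://example.invalid/example-{version}.tar.gz",
#         destination="/tmp/downloads",
#         version="1.0",
#         checksum="0123456789abcdef0123456789abcdef01234567",
#     )
#     dl()  # fetches the archive, then validates the sha1 checksum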


class Builder:
    """
    Utility that handles the build process.

    :param root: The root of the working directories for this build
    :type root: str
    :param recipies: The instructions for the build steps
    :type recipies: list
    :param build_default: The default build function, defaults to ``build_default``
    :type build_default: types.FunctionType
    :param populate_env: The default function to populate the build environment, defaults to ``populate_env``
    :type populate_env: types.FunctionType
    :param force_download: If True, forces downloading the archives even if they exist, defaults to False
    :type force_download: bool
    :param arch: The architecture being built
    :type arch: str
    """

    def __init__(
        self,
        root=None,
        recipies=None,
        build_default=build_default,
        populate_env=populate_env,
        arch="x86_64",
        version="",
    ):
        self.root = root
        self.dirs = work_dirs(root)
        self.build_arch = build_arch()
        self.build_triplet = get_triplet(self.build_arch)
        self.arch = arch
        self.triplet = get_triplet(self.arch)
        self.version = version
        # XXX Refactor WorkDirs, Dirs and Builder so as not to duplicate logic
        self.prefix = self.dirs.build / f"{self.version}-{self.triplet}"
        self.sources = self.dirs.src
        self.downloads = self.dirs.download
        if recipies is None:
            self.recipies = {}
        else:
            self.recipies = recipies
        self.build_default = build_default
        self.populate_env = populate_env
        self.toolchains = get_toolchain(root=self.dirs.root)
        self.set_arch(self.arch)

    def copy(self, version, checksum):
        """
        Create a copy of this builder pinned to the given python version and checksum.
        """
        recipies = {}
        for name in self.recipies:
            _ = self.recipies[name]
            recipies[name] = {
                "build_func": _["build_func"],
                "wait_on": _["wait_on"],
                "download": _["download"].copy() if _["download"] else None,
            }
        build = Builder(
            self.root,
            recipies,
            self.build_default,
            self.populate_env,
            self.arch,
            version,
        )
        build.recipies["python"]["download"].version = version
        build.recipies["python"]["download"].checksum = checksum
        return build

    def set_arch(self, arch):
        """
        Set the architecture for the build.

        :param arch: The arch to build
        :type arch: str
        """
        self.arch = arch
        self.triplet = get_triplet(self.arch)
        self.prefix = self.dirs.build / f"{self.version}-{self.triplet}"
        if sys.platform in ["darwin", "win32"]:
            self.toolchain = None
        else:
            self.toolchain = get_toolchain(self.arch, self.dirs.root)

    @property
    def _triplet(self):
        if sys.platform == "darwin":
            return "{}-macos".format(self.arch)
        elif sys.platform == "win32":
            return "{}-win".format(self.arch)
        else:
            return "{}-linux-gnu".format(self.arch)

    def add(self, name, build_func=None, wait_on=None, download=None):
        """
        Add a step to the build process.

        :param name: The name of the step
        :type name: str
        :param build_func: The function that builds this step, defaults to None
        :type build_func: types.FunctionType, optional
        :param wait_on: Processes to wait on before running this step, defaults to None
        :type wait_on: list, optional
        :param download: A dictionary of download information, defaults to None
        :type download: dict, optional
        """
        if wait_on is None:
            wait_on = []
        if build_func is None:
            build_func = self.build_default
        if download is not None:
            download = Download(name, destination=self.downloads, **download)
        self.recipies[name] = {
            "build_func": build_func,
            "wait_on": wait_on,
            "download": download,
        }

    def run(
        self, name, event, build_func, download, show_ui=False, log_level="WARNING"
    ):
        """
        Run a build step.

        :param name: The name of the step to run
        :type name: str
        :param event: An event to track this process' status and alert waiting steps
        :type event: ``multiprocessing.Event``
        :param build_func: The function to use to build this step
        :type build_func: types.FunctionType
        :param download: The ``Download`` instance for this step
        :type download: ``Download``

        :return: The output of the build function
        """
        root_log = logging.getLogger(None)
        if sys.platform == "win32":
            if not show_ui:
                handler = logging.StreamHandler()
                handler.setLevel(logging.getLevelName(log_level))
                root_log.addHandler(handler)
        for handler in root_log.handlers:
            if isinstance(handler, logging.StreamHandler):
                handler.setFormatter(
                    logging.Formatter(f"%(asctime)s {name} %(message)s")
                )
        if not self.dirs.build.exists():
            os.makedirs(self.dirs.build, exist_ok=True)
        dirs = Dirs(self.dirs, name, self.arch, self.version)
        os.makedirs(dirs.sources, exist_ok=True)
        os.makedirs(dirs.logs, exist_ok=True)
        os.makedirs(dirs.prefix, exist_ok=True)
        while event.is_set() is False:
            time.sleep(0.3)
        logfp = io.open(os.path.join(dirs.logs, "{}.log".format(name)), "w")
        handler = logging.FileHandler(dirs.logs / f"{name}.log")
        root_log.addHandler(handler)
        root_log.setLevel(logging.NOTSET)
        # DEBUG: Uncomment to debug
        # logfp = sys.stdout
        cwd = os.getcwd()
        if download:
            extract_archive(dirs.sources, str(download.filepath))
            dirs.source = dirs.sources / download.filepath.name.split(".tar")[0]
            os.chdir(dirs.source)
        else:
            os.chdir(dirs.prefix)
        if sys.platform == "win32":
            env = os.environ.copy()
        else:
            env = {
                "PATH": os.environ["PATH"],
            }
        env["RELENV_DEBUG"] = "1"
        env["RELENV_BUILDENV"] = "1"
        env["RELENV_HOST"] = self.triplet
        env["RELENV_HOST_ARCH"] = self.arch
        env["RELENV_BUILD"] = self.build_triplet
        env["RELENV_BUILD_ARCH"] = self.build_arch
        env["RELENV_PY_VERSION"] = self.recipies["python"]["download"].version
        env["RELENV_PY_MAJOR_VERSION"] = env["RELENV_PY_VERSION"].rsplit(".", 1)[0]
        if "RELENV_DATA" in os.environ:
            env["RELENV_DATA"] = os.environ["RELENV_DATA"]
        if self.build_arch != self.arch:
            native_root = DATA_DIR / "native"
            env["RELENV_NATIVE_PY"] = str(native_root / "bin" / "python3")
        self.populate_env(env, dirs)
        _ = dirs.to_dict()
        for k in _:
            log.info("Directory %s %s", k, _[k])
        for k in env:
            log.info("Environment %s %s", k, env[k])
        try:
            return build_func(env, dirs, logfp)
        except Exception:
            log.exception("Build failure")
            sys.exit(1)
        finally:
            os.chdir(cwd)
            root_log.removeHandler(handler)
            logfp.close()

    def cleanup(self):
        """
        Clean up the build directories.
        """
        shutil.rmtree(self.prefix)

    def clean(self):
        """
        Completely clean up the remnants of a relenv build.
        """
        # Clean directories
        for _ in [self.prefix, self.sources]:
            try:
                shutil.rmtree(_)
            except PermissionError:
                sys.stderr.write(f"Unable to remove directory: {_}")
            except FileNotFoundError:
                pass
        # Clean files
        archive = f"{self.prefix}.tar.xz"
        for _ in [archive]:
            try:
                os.remove(_)
            except FileNotFoundError:
                pass

    def download_files(self, steps=None, force_download=False, show_ui=False):
        """
        Download all of the needed archives.

        :param steps: The steps to download archives for, defaults to None
        :type steps: list, optional
        """
        if steps is None:
            steps = list(self.recipies)
        fails = []
        processes = {}
        events = {}
        if show_ui:
            sys.stdout.write("Starting downloads \n")
        log.info("Starting downloads")
        if show_ui:
            print_ui(events, processes, fails)
        for name in steps:
            download = self.recipies[name]["download"]
            if download is None:
                continue
            event = multiprocessing.Event()
            event.set()
            events[name] = event
            proc = multiprocessing.Process(
                name=name,
                target=download,
                kwargs={
                    "force_download": force_download,
                    "show_ui": show_ui,
                    "exit_on_failure": True,
                },
            )
            proc.start()
            processes[name] = proc
        while processes:
            for proc in list(processes.values()):
                proc.join(0.3)
                # DEBUG: Comment to debug
                if show_ui:
                    print_ui(events, processes, fails)
                if proc.exitcode is None:
                    continue
                processes.pop(proc.name)
                if proc.exitcode != 0:
                    fails.append(proc.name)
        if show_ui:
            print_ui(events, processes, fails)
            sys.stdout.write("\n")
        if fails and False:
            if show_ui:
                print_ui(events, processes, fails)
            sys.stderr.write("The following failures were reported\n")
            for fail in fails:
                sys.stderr.write(fail + "\n")
            sys.stderr.flush()
            sys.exit(1)

    def build(self, steps=None, cleanup=True, show_ui=False, log_level="WARNING"):
        """
        Build!

        :param steps: The steps to run, defaults to None
        :type steps: list, optional
        :param cleanup: Whether to clean up or not, defaults to True
        :type cleanup: bool, optional
        """  # noqa: D400
        fails = []
        events = {}
        waits = {}
        processes = {}
        if show_ui:
            sys.stdout.write("Starting builds\n")
            # DEBUG: Comment to debug
            print_ui(events, processes, fails)
        log.info("Starting builds")
        for name in steps:
            event = multiprocessing.Event()
            events[name] = event
            kwargs = dict(self.recipies[name])
            kwargs["show_ui"] = show_ui
            kwargs["log_level"] = log_level
            # Determine needed dependency recipies.
            wait_on = kwargs.pop("wait_on", [])
            for _ in wait_on[:]:
                if _ not in steps:
                    wait_on.remove(_)
            waits[name] = wait_on
            if not waits[name]:
                event.set()
            proc = multiprocessing.Process(
                name=name, target=self.run, args=(name, event), kwargs=kwargs
            )
            proc.start()
            processes[name] = proc
        # Wait for the processes to finish and check if we should send any
        # dependency events.
        while processes:
            for proc in list(processes.values()):
                proc.join(0.3)
                if show_ui:
                    # DEBUG: Comment to debug
                    print_ui(events, processes, fails)
                if proc.exitcode is None:
                    continue
                processes.pop(proc.name)
                if proc.exitcode != 0:
                    fails.append(proc.name)
                    is_failure = True
                else:
                    is_failure = False
                for name in waits:
                    if proc.name in waits[name]:
                        if is_failure:
                            if name in processes:
                                processes[name].terminate()
                                time.sleep(0.1)
                        waits[name].remove(proc.name)
                    if not waits[name] and not events[name].is_set():
                        events[name].set()
        if fails:
            sys.stderr.write("The following failures were reported\n")
            for fail in fails:
                log_file = self.dirs.logs / f"{fail}.log"
                try:
                    with io.open(log_file) as fp:
                        # Only report the tail end of large log files.
                        fp.seek(0, 2)
                        end = fp.tell()
                        ind = end - 4096
                        fp.seek(ind if ind > 0 else 0)
                        last_out = fp.read()
                    if show_ui:
                        sys.stderr.write("=" * 20 + f" {fail} " + "=" * 20 + "\n")
                        sys.stderr.write(last_out + "\n\n")
                except FileNotFoundError:
                    last_out = f"Log file not found: {log_file}"
                log.error("Build step %s has failed", fail)
                log.error(last_out)
            if show_ui:
                sys.stderr.flush()
            if cleanup:
                log.debug("Performing cleanup.")
                self.cleanup()
            sys.exit(1)
        if show_ui:
            time.sleep(0.3)
            print_ui(events, processes, fails)
            sys.stdout.write("\n")
            sys.stdout.flush()
        if cleanup:
            log.debug("Performing cleanup.")
            self.cleanup()

    def check_prereqs(self):
        """
        Check the prerequisites for a build.

        This method verifies all requirements for a successful build are satisfied.

        :return: Returns a list of strings describing failed checks
        :rtype: list
        """
        fail = []
        if self.toolchain and not self.toolchain.exists():
            fail.append(
                f"Toolchain for {self.arch} does not exist. Please use relenv "
                "toolchain to obtain a toolchain."
            )
        return fail

    def __call__(
        self,
        steps=None,
        arch=None,
        clean=True,
        cleanup=True,
        force_download=False,
        download_only=False,
        show_ui=False,
        log_level="WARNING",
    ):
        """
        Set the architecture, define the steps, clean if needed, download what
        is needed, and build.

        :param steps: The steps to run, defaults to None
        :type steps: list, optional
        :param arch: The architecture to build, defaults to None
        :type arch: str, optional
        :param clean: If true, cleans the directories first, defaults to True
        :type clean: bool, optional
        :param cleanup: Cleans up after build if true, defaults to True
        :type cleanup: bool, optional
        :param force_download: Whether or not to download the content if it already exists, defaults to False
        :type force_download: bool, optional
        """
        log = logging.getLogger(None)
        log.setLevel(logging.NOTSET)
        if not show_ui:
            handler = logging.StreamHandler()
            handler.setLevel(logging.getLevelName(log_level))
            log.addHandler(handler)
        os.makedirs(self.dirs.logs, exist_ok=True)
        handler = logging.FileHandler(self.dirs.logs / "build.log")
        handler.setLevel(logging.INFO)
        log.addHandler(handler)
        if arch:
            self.set_arch(arch)
        if steps is None:
            steps = self.recipies
        failures = self.check_prereqs()
        if failures:
            for _ in failures:
                sys.stderr.write(f"{_}\n")
            sys.stderr.flush()
            sys.exit(1)
        if clean:
            self.clean()
        if self.build_arch != self.arch:
            native_root = DATA_DIR / "native"
            if not native_root.exists():
                if "RELENV_NATIVE_PY_VERSION" in os.environ:
                    version = os.environ["RELENV_NATIVE_PY_VERSION"]
                else:
                    version = self.version
                from relenv.create import create

                create("native", DATA_DIR, version=version)
        # Start a process for each build, passing it an event used to notify
        # each process when its dependencies have finished.
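        # Scheduling sketch (illustrative): ``build`` starts one process per
        # step. A step's ``multiprocessing.Event`` is only set once every
        # recipe named in its ``wait_on`` list has finished, and ``run``
        # spins on ``event.is_set()`` before doing any work. So with recipes
        # like {"openssl": [], "python": ["openssl"]}, the python step's
        # process starts immediately but sleeps until openssl completes.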
        self.download_files(steps, force_download=force_download, show_ui=show_ui)
        if download_only:
            return
        self.build(steps, cleanup, show_ui=show_ui, log_level=log_level)

    def check_versions(self):
        """
        Check each recipe's download for a newer upstream version.
        """
        success = True
        for step in list(self.recipies):
            download = self.recipies[step]["download"]
            if not download:
                continue
            if not download.check_version():
                success = False
        return success


def patch_shebang(path, old, new):
    """
    Replace a file's shebang.

    :param path: The path of the file to patch
    :type path: str
    :param old: The old shebang, will only patch when this is found
    :type old: str
    :param new: The new shebang to be written
    :type new: str
    """
    with open(path, "rb") as fp:
        try:
            data = fp.read(len(old.encode())).decode()
        except UnicodeError:
            return False
        except Exception as exc:
            log.warning("Unhandled exception: %r", exc)
            return False
        if data != old:
            log.warning("Shebang doesn't match: %s %r != %r", path, old, data)
            return False
        data = fp.read().decode()
    with open(path, "w") as fp:
        fp.write(new)
        fp.write(data)
    with open(path, "r") as fp:
        data = fp.read()
    log.info("Patched shebang of %s => %r", path, data)
    return True


def patch_shebangs(path, old, new):
    """
    Traverse a directory and patch shebangs.

    :param path: The path of the directory to traverse
    :type path: str
    :param old: The old shebang, will only patch when this is found
    :type old: str
    :param new: The new shebang to be written
    :type new: str
    """
    for root, _dirs, files in os.walk(str(path)):
        for file in files:
            patch_shebang(os.path.join(root, file), old, new)


def install_sysdata(mod, destfile, buildroot, toolchain):
    """
    Create a Relenv Python environment's sysconfigdata.

    Helper method used by the `finalize` build method to create a Relenv
    Python environment's sysconfigdata.

    :param mod: The module to operate on
    :type mod: ``types.ModuleType``
    :param destfile: Path to the file to write the data to
    :type destfile: str
    :param buildroot: Path to the root of the build
    :type buildroot: str
    :param toolchain: Path to the root of the toolchain
    :type toolchain: str
    """
    data = {}
    fbuildroot = lambda _: _.replace(str(buildroot), "{BUILDROOT}")  # noqa: E731
    ftoolchain = lambda _: _.replace(str(toolchain), "{TOOLCHAIN}")  # noqa: E731
    # XXX: keymap is not used, remove it?
    # keymap = {
    #     "BINDIR": (fbuildroot,),
    #     "BINLIBDEST": (fbuildroot,),
    #     "CFLAGS": (fbuildroot, ftoolchain),
    #     "CPPLAGS": (fbuildroot, ftoolchain),
    #     "CXXFLAGS": (fbuildroot, ftoolchain),
    #     "datarootdir": (fbuildroot,),
    #     "exec_prefix": (fbuildroot,),
    #     "LDFLAGS": (fbuildroot, ftoolchain),
    #     "LDSHARED": (fbuildroot, ftoolchain),
    #     "LIBDEST": (fbuildroot,),
    #     "prefix": (fbuildroot,),
    #     "SCRIPTDIR": (fbuildroot,),
    # }
    for key in sorted(mod.build_time_vars):
        val = mod.build_time_vars[key]
        if isinstance(val, str):
            for _ in (fbuildroot, ftoolchain):
                val = _(val)
            log.info("SYSCONFIG [%s] %s => %s", key, mod.build_time_vars[key], val)
        data[key] = val
    with open(destfile, "w", encoding="utf8") as f:
        f.write(
            "# system configuration generated and used by"
            " the relenv at runtime\n"
        )
        f.write("_build_time_vars = ")
        pprint.pprint(data, stream=f)
        f.write(SYSCONFIGDATA)


def find_sysconfigdata(pymodules):
    """
    Find the sysconfigdata module for a python installation.

    :param pymodules: Path to python modules (e.g. lib/python3.10)
    :type pymodules: str

    :return: The name of the sysconfig data module
    :rtype: str
    """
    for root, dirs, files in os.walk(pymodules):
        for file in files:
            if file.find("sysconfigdata") > -1 and file.endswith(".py"):
                return file[:-3]
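

# Illustrative sketch (not part of relenv): what a shebang patch amounts to.
# The paths below are made up:
#
#     old = "#!/tmp/build/3.10.14-x86_64-linux-gnu/bin/python3.10"
#     new = format_shebang("/python3")  # relocatable shebang helper
#     patch_shebang("/tmp/build/3.10.14-x86_64-linux-gnu/bin/pip3", old, new)
#
# ``patch_shebang`` only rewrites a file when its first bytes exactly match
# ``old``, so running it with several candidate shebangs (as ``finalize``
# does) is safe.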
""" relenv_pth = sitepackages / "relenv.pth" with io.open(str(relenv_pth), "w") as fp: fp.write(RELENV_PTH) # Lay down relenv.runtime, we'll pip install the rest later relenv = sitepackages / "relenv" os.makedirs(relenv, exist_ok=True) for name in [ "runtime.py", "relocate.py", "common.py", "buildenv.py", "__init__.py", ]: src = MODULE_DIR / name dest = relenv / name with io.open(src, "r") as rfp: with io.open(dest, "w") as wfp: wfp.write(rfp.read()) def finalize(env, dirs, logfp): """ Run after we've fully built python. This method enhances the newly created python with Relenv's runtime hacks. :param env: The environment dictionary :type env: dict :param dirs: The working directories :type dirs: ``relenv.build.common.Dirs`` :param logfp: A handle for the log file :type logfp: file """ # Run relok8 to make sure the rpaths are relocatable. relenv.relocate.main(dirs.prefix, log_file_name=str(dirs.logs / "relocate.py.log")) # Install relenv-sysconfigdata module libdir = pathlib.Path(dirs.prefix) / "lib" def find_pythonlib(libdir): for root, dirs, files in os.walk(libdir): for _ in dirs: if _.startswith("python"): return _ pymodules = libdir / find_pythonlib(libdir) cwd = os.getcwd() modname = find_sysconfigdata(pymodules) path = sys.path sys.path = [str(pymodules)] try: mod = __import__(str(modname)) finally: os.chdir(cwd) sys.path = path dest = pymodules / f"{modname}.py" install_sysdata(mod, dest, dirs.prefix, dirs.toolchain) # Lay down site customize bindir = pathlib.Path(dirs.prefix) / "bin" sitepackages = pymodules / "site-packages" install_runtime(sitepackages) # Install pip python = dirs.prefix / "bin" / "python3" if env["RELENV_HOST_ARCH"] != env["RELENV_BUILD_ARCH"]: env["RELENV_CROSS"] = dirs.prefix python = env["RELENV_NATIVE_PY"] logfp.write("\nRUN ENSURE PIP\n") runcmd( [str(python), "-m", "ensurepip"], env=env, stderr=logfp, stdout=logfp, ) # Fix the shebangs in the scripts python layed down. Order matters. shebangs = [ "#!{}".format(bindir / f"python{env['RELENV_PY_MAJOR_VERSION']}"), "#!{}".format( bindir / f"python{env['RELENV_PY_MAJOR_VERSION'].split('.', 1)[0]}" ), ] newshebang = format_shebang("/python3") for shebang in shebangs: log.info("Patch shebang %r with %r", shebang, newshebang) patch_shebangs( str(pathlib.Path(dirs.prefix) / "bin"), shebang, newshebang, ) if sys.platform == "linux": pyconf = f"config-{env['RELENV_PY_MAJOR_VERSION']}-{env['RELENV_HOST']}" patch_shebang( str(pymodules / pyconf / "python-config.py"), "#!{}".format(str(bindir / f"python{env['RELENV_PY_MAJOR_VERSION']}")), format_shebang("../../../bin/python3"), ) shutil.copy( pathlib.Path(dirs.toolchain) / env["RELENV_HOST"] / "sysroot" / "lib" / "libstdc++.so.6", libdir, ) # Moved in python 3.13 or removed? if (pymodules / "cgi.py").exists(): patch_shebang( str(pymodules / "cgi.py"), "#! 
/usr/local/bin/python", format_shebang("../../bin/python3"), ) def runpip(pkg, upgrade=False): logfp.write(f"\nRUN PIP {pkg} {upgrade}\n") target = None python = dirs.prefix / "bin" / "python3" if sys.platform == LINUX: if env["RELENV_HOST_ARCH"] != env["RELENV_BUILD_ARCH"]: target = pymodules / "site-packages" python = env["RELENV_NATIVE_PY"] cmd = [ str(python), "-m", "pip", "install", str(pkg), ] if upgrade: cmd.append("--upgrade") if target: cmd.append("--target={}".format(target)) runcmd(cmd, env=env, stderr=logfp, stdout=logfp) runpip("wheel") # This needs to handle running from the root of the git repo and also from # an installed Relenv if (MODULE_DIR.parent / ".git").exists(): runpip(MODULE_DIR.parent, upgrade=True) else: runpip("relenv", upgrade=True) globs = [ "/bin/python*", "/bin/pip*", "/bin/relenv", "/lib/python*/ensurepip/*", "/lib/python*/site-packages/*", "/include/*", "*.so", "/lib/*.so.*", "*.py", # Mac specific, factor this out "*.dylib", ] archive = f"{ dirs.prefix }.tar.xz" log.info("Archive is %s", archive) with tarfile.open(archive, mode="w:xz") as fp: create_archive(fp, dirs.prefix, globs, logfp) def create_archive(tarfp, toarchive, globs, logfp=None): """ Create an archive. :param tarfp: A pointer to the archive to be created :type tarfp: file :param toarchive: The path to the directory to archive :type toarchive: str :param globs: A list of filtering patterns to match against files to be added :type globs: list :param logfp: A pointer to the log file :type logfp: file """ if logfp is None: log.info("Current directory %s", os.getcwd()) log.info("Creating archive %s", tarfp.name) for root, _dirs, files in os.walk(toarchive): relroot = pathlib.Path(root).relative_to(toarchive) for f in files: relpath = relroot / f matches = False for g in globs: if glob.fnmatch.fnmatch("/" / relpath, g): matches = True break if matches: if logfp is None: log.info("Adding %s", relpath) tarfp.add(relpath, relpath, recursive=False) else: if logfp is None: log.info("Skipping %s", relpath)