Train ML

A collection of machine learning tools

Setup

Utils

Splitter


source

get_splits

 get_splits (df:pandas.core.frame.DataFrame, stratified:str=None,
             group:str=None, nfold:int=5, seed:int=123)

Split samples in a dataframe using the Stratified, Group, or StratifiedGroup KFold method

Type Default Details
df DataFrame dataframe containing the info needed for the split
stratified str None column name for stratified k-fold; each fold samples proportionally from the different groups
group str None column name for group k-fold; train and test come from different groups
nfold int 5
seed int 123
# read training data
df = pd.read_parquet('https://github.com/sky1ove/katlas_raw/raw/refs/heads/main/nbs/raw/combine_t5_kd.parquet').reset_index()

# read data that contains info for the split
info_df = Data.get_kinase_info().query('pseudo!="1"') # get non-pseudo kinase
# merge info with training data
info = df[['kinase']].merge(info_df)
info.head()
(info.head() output: one annotation row per kinase — SRC, EPHA3, FES, NTRK3, ALK — with columns such as ID_coral, uniprot, group, family, subfamily, pspa/cddm categories, sequence length, the full human UniProt sequence, the kinase domain, and subcellular-location scores; the long sequence columns are omitted here.)
# stratify samples based on group
splits = get_splits(info,stratified='group')
len(splits)
split0 = splits[0]
StratifiedKFold(n_splits=5, random_state=123, shuffle=True)
# kinase group in train set: 9
# kinase group in test set: 9
---------------------------
# kinase in train set: 312
---------------------------
# kinase in test set: 78
---------------------------
test set: ['EPHA3' 'FES' 'FLT3' 'FYN' 'EPHB1' 'EPHB3' 'FER' 'EPHB4' 'FLT4' 'FGFR1' 'EPHA5' 'TEK' 'DDR2' 'ZAP70' 'LIMK1' 'ULK3' 'JAK1' 'WEE1' 'TESK1' 'MAP2K3' 'AMPKA2' 'ATM' 'CAMK1D' 'CAMK2D' 'CAMK4' 'CAMKK1'
 'CK1D' 'CK1E' 'DYRK2' 'DYRK4' 'HGK' 'IKKE' 'JNK2' 'JNK3' 'KHS1' 'MAPKAPK5' 'MEK2' 'MSK2' 'NDR1' 'NEK6' 'NEK9' 'NIM1' 'NLK' 'OSR1' 'P38A' 'P38B' 'P90RSK' 'PAK1' 'PERK' 'PKCH' 'PKCI' 'PKN1' 'ROCK2'
 'RSK2' 'SIK' 'STLK3' 'TAK1' 'TSSK1' 'ALPHAK3' 'BMPR2' 'CDK10' 'CDK13' 'CDK14' 'CDKL5' 'GCN2' 'GRK4' 'IRE1' 'KHS2' 'MASTL' 'MLK4' 'MNK1' 'MRCKA' 'PRPK' 'QSK' 'SMMLCK' 'SSTK' 'ULK2' 'VRK1']
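The example above stratifies folds by kinase group. Based on the signature above, passing group instead should keep whole kinase groups out of the test fold, and passing both presumably triggers the StratifiedGroup variant; a usage sketch (the column choices here are assumptions):

# group k-fold: kinases from the same group never land in both train and test (assumed usage)
group_splits = get_splits(info, group='group')

# supplying both arguments presumably yields a StratifiedGroupKFold split; 'family' is an assumed column choice
sg_splits = get_splits(info, stratified='family', group='group')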
# feature columns: T5 protein-embedding dimensions
feat_col = df.columns[df.columns.str.startswith('T5_')]

# target columns: everything except the features and the kinase identifier
target_col = df.columns[~df.columns.isin(feat_col)][1:]
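As a quick sanity check of the column selection (the counts match the shapes reported for split_data below):

# 1024 T5 embedding features and 210 position-specific amino-acid targets
len(feat_col), len(target_col)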

source

split_data

 split_data (df:pandas.core.frame.DataFrame, feat_col:list,
             target_col:list, split:tuple)

Given a split tuple, split the dataframe into X_train, y_train, X_test, and y_test

Type Details
df DataFrame dataframe of values
feat_col list feature columns
target_col list target columns
split tuple one of the splits in splits
X_train, y_train, X_test, y_test = split_data(df,feat_col, target_col, split0)
X_train.shape,y_train.shape,X_test.shape,y_test.shape
((312, 1024), (312, 210), (78, 1024), (78, 210))
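Conceptually, split_data is plain positional indexing; a minimal sketch, assuming split is a (train_idx, test_idx) tuple of positional indices as returned by sklearn's k-fold splitters (not the library implementation):

import pandas as pd

def split_data_sketch(df: pd.DataFrame, feat_col, target_col, split):
    "Minimal sketch of split_data; assumes positional (train_idx, test_idx) indices."
    train_idx, test_idx = split
    X_train, y_train = df.iloc[train_idx][feat_col], df.iloc[train_idx][target_col]
    X_test, y_test = df.iloc[test_idx][feat_col], df.iloc[test_idx][target_col]
    return X_train, y_train, X_test, y_test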

Scoring


source

score_each

 score_each (target:pandas.core.frame.DataFrame,
             pred:pandas.core.frame.DataFrame, absolute=True,
             verbose=True)

Calculate the overall MSE and average Pearson (per row) between two dataframes.

Type Default Details
target DataFrame target dataframe
pred DataFrame predicted dataframe
absolute bool True if True, average the absolute values of the per-row Pearson/Spearman correlations
verbose bool True whether or not to display the error values
mse,pearson_avg,pearson_all = score_each(y_test, y_test)
overall MSE: 0.0000
Average Pearson: 1.0000 
pearson_all.head()
Pearson
3 1.0
8 1.0
10 1.0
19 1.0
24 1.0
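For reference, the reported numbers combine an overall MSE across all cells with a Pearson correlation computed per row and then averaged; a minimal sketch of that scoring logic (not the package implementation):

import numpy as np
import pandas as pd
from scipy.stats import pearsonr

def score_each_sketch(target: pd.DataFrame, pred: pd.DataFrame, absolute=True):
    "Overall MSE plus per-row Pearson average; a sketch, not the library code."
    mse = float(np.mean((target.values - pred.values) ** 2))  # MSE over all cells
    rows = [pearsonr(t, p)[0] for t, p in zip(target.values, pred.values)]  # one Pearson per row
    pearson_all = pd.DataFrame({'Pearson': rows}, index=target.index)
    vals = pearson_all['Pearson'].abs() if absolute else pearson_all['Pearson']
    return mse, float(vals.mean()), pearson_all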

Machine Learning

For regression tasks.

Trainer


source

train_ml

 train_ml (df, feat_col, target_col, split, model, save=None, params={})

Fit and predict using the sklearn model API; return the target and prediction for the validation dataset.

Type Default Details
df dataframe of values
feat_col feature columns
target_col target columns
split one of the splits in splits
model an sklearn model
save NoneType None file (.joblib) to save, e.g. ‘model.joblib’
params dict {} parameters for model.fit from sklearn
model = LinearRegression()

## Uncomment to run and save the model
# target,pred = train_ml(df, feat_col, target_col, split0, model,'model.joblib')

# Run without saving model
target,pred = train_ml(df, feat_col, target_col, split0, model)

pred.head()
LinearRegression()
(pred.head() output: a 5 × 210 dataframe of predicted values for the position-specific amino-acid target columns -5P … 0y, indexed by validation samples 3, 8, 10, 19, and 24.)
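Any estimator that follows the sklearn fit/predict API and supports multi-output targets should drop into the same call; a usage sketch with Ridge (an assumed choice, not from the original example):

from sklearn.linear_model import Ridge

# Ridge handles multi-output regression natively, so it slots in unchanged
target_r, pred_r = train_ml(df, feat_col, target_col, split0, Ridge(alpha=1.0))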

Cross-Validation


source

train_ml_cv

 train_ml_cv (df, feat_col, target_col, splits, model, save=None,
              params={})

Run cross-validation over the given splits

Type Default Details
df dataframe of values
feat_col feature columns
target_col target columns
splits splits
model sklearn model
save NoneType None model name to be saved, e.g., ‘LR’
params dict {} acts as kwargs for model.fit
oof,metrics = train_ml_cv(df,feat_col,target_col,splits,model)
------ fold: 0 --------
LinearRegression()
overall MSE: 0.8763
Average Pearson: 0.7168 
------ fold: 1 --------
LinearRegression()
overall MSE: 0.6406
Average Pearson: 0.7313 
------ fold: 2 --------
LinearRegression()
overall MSE: 0.7465
Average Pearson: 0.7429 
------ fold: 3 --------
LinearRegression()
overall MSE: 0.6453
Average Pearson: 0.7328 
------ fold: 4 --------
LinearRegression()
overall MSE: 0.7906
Average Pearson: 0.7263 
metrics
fold mse pearson_avg
0 0 0.876279 0.716824
1 1 0.640555 0.731331
2 2 0.746466 0.742891
3 3 0.645253 0.732798
4 4 0.790580 0.726299
# plot MSE and Pearson scores across folds
metrics.iloc[:,1:].plot.box();

# Overall score for oof
_, _, corr_df = score_each(oof, df[target_col])
overall MSE: 0.7398
Average Pearson: 0.7300 
corr_df.sort_values('Pearson').head() # lowest Pearson
Pearson
81 -0.244100
156 -0.238898
313 -0.197908
381 -0.151470
205 -0.081771
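To persist the per-fold models, the save argument takes a model name (e.g., 'LR'); mirroring the earlier pattern, the call is left commented out, and the exact output file naming is an assumption:

## Uncomment to run cross-validation and save each fold's fitted model under the name 'LR'
# oof_lr, metrics_lr = train_ml_cv(df, feat_col, target_col, splits, LinearRegression(), save='LR')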

Predictor


source

predict_ml

 predict_ml (df, feat_col, target_col=None, model_pth='model.joblib')

Make predictions based on a trained model.

Type Default Details
df Dataframe that contains features
feat_col feature columns
target_col NoneType None
model_pth str model.joblib

Uncomment the lines below to run if you have a saved model at model_pth:

# pred2 = predict_ml(X_test,feat_col, target_col, model_pth = 'model.joblib')
# pred2.head()
## or
# predict_ml(df.iloc[split0[1]], feat_col, model_pth='model.joblib')
/usr/local/lib/python3.9/dist-packages/sklearn/base.py:376: InconsistentVersionWarning: Trying to unpickle estimator LinearRegression from version 1.4.1.post1 when using version 1.5.0. This might lead to breaking code or invalid results. Use at your own risk. For more info please refer to:
https://scikit-learn.org/stable/model_persistence.html#security-maintainability-limitations
  warnings.warn(
(pred2.head() output: predicted values for the same 210 position-specific amino-acid target columns, essentially identical to the in-memory predictions shown above.)
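For reference, predict_ml presumably loads the joblib-serialized estimator and wraps the predictions in a dataframe; a minimal sketch under that assumption (not the library implementation):

import joblib
import pandas as pd

def predict_ml_sketch(df, feat_col, target_col=None, model_pth='model.joblib'):
    "Load a saved sklearn model and predict; a sketch of the assumed behavior."
    model = joblib.load(model_pth)  # may raise the version warning shown above if sklearn versions differ
    pred = model.predict(df[feat_col])
    return pd.DataFrame(pred, index=df.index, columns=target_col)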