Extracts the prognostic_info list element from an rctglm_prog object. See 'Value' at rctglm_with_prognosticscore for more details.
Arguments
- x: an object of class rctglm_prog (returned by rctglm_with_prognosticscore)
Value
a list with the structure of prognostic_info described in the 'Value' section of rctglm_with_prognosticscore.
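For illustration, an extractor of this kind typically just returns the stored list element. The sketch below uses a hypothetical prog_sketch() and a mocked-up object (neither is part of the package) to show the assumed behavior:

```r
# Hypothetical sketch: assuming prog() simply returns the prognostic_info
# element stored on the rctglm_prog object. prog_sketch() and the mock
# object are illustrative stand-ins, not package code.
prog_sketch <- function(x) x$prognostic_info

mock <- structure(
  list(prognostic_info = list(cv_folds = 5)),
  class = "rctglm_prog"
)
prog_sketch(mock)$cv_folds
#> [1] 5
```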
See also
The function rctglm_with_prognosticscore(), which creates the rctglm_prog objects this extractor works on.
Examples
# Generate some data
n <- 100
b0 <- 1
b1 <- 1.5
b2 <- 2
W1 <- runif(n, min = 1, max = 10)
exposure_prob <- .5

dat_treat <- glm_data(
  Y ~ b0 + b1 * log(W1) + b2 * A,
  W1 = W1,
  A = rbinom(n, 1, exposure_prob)
)

dat_notreat <- glm_data(
  Y ~ b0 + b1 * log(W1),
  W1 = W1
)

learners <- list(
  mars = list(
    model = parsnip::set_engine(
      parsnip::mars(
        mode = "regression", prod_degree = 3
      ),
      "earth"
    )
  )
)

ate <- rctglm_with_prognosticscore(
  formula = Y ~ .,
  exposure_indicator = A,
  exposure_prob = exposure_prob,
  data = dat_treat,
  family = gaussian(),
  estimand_fun = "ate",
  data_hist = dat_notreat,
  learners = learners
)
#>
#> ── Fitting prognostic model ──
#>
#> ℹ Created formula for fitting prognostic model as: Y ~ .
#> ℹ Fitting learners
#> • mod_mars
#> ℹ No tuning parameters. `fit_resamples()` will be attempted
#> ℹ 1 of 1 resampling: mod_mars
#> ✔ 1 of 1 resampling: mod_mars (297ms)
#> ℹ Model with lowest RMSE: mod_mars
#> ℹ Investigate trained learners and fitted model in `prognostic_info` list element
#>
#> ── Symbolic differentiation of estimand function ──
#>
#> ℹ Symbolically deriving partial derivative of the function 'psi1 - psi0' with respect to 'psi0' as: '-1'.
#> • Alternatively, specify the derivative through the argument
#> `estimand_fun_deriv0`
#> ℹ Symbolically deriving partial derivative of the function 'psi1 - psi0' with respect to 'psi1' as: '1'.
#> • Alternatively, specify the derivative through the argument
#> `estimand_fun_deriv1`
prog(ate)
#> $formula
#> Y ~ .
#> <environment: 0x561fcedd9120>
#>
#> $model_fit
#> ══ Workflow [trained] ══════════════════════════════════════════════════════════
#> Preprocessor: Formula
#> Model: mars()
#>
#> ── Preprocessor ────────────────────────────────────────────────────────────────
#> Y ~ .
#>
#> ── Model ───────────────────────────────────────────────────────────────────────
#> Selected 2 of 8 terms, and 1 of 1 predictors
#> Termination condition: RSq changed by less than 0.001 at 8 terms
#> Importance: W1
#> Number of terms at each degree of interaction: 1 1 (additive model)
#> GCV 1.135246 RSS 105.7169 GRSq 0.282691 RSq 0.3184613
#>
#> $learners
#> $learners$mars
#> $learners$mars$model
#> MARS Model Specification (regression)
#>
#> Main Arguments:
#> prod_degree = 3
#>
#> Computational engine: earth
#>
#>
#>
#>
#> $cv_folds
#> [1] 5
#>
#> $data
#> Y W1
#> 1 3.6621214 4.104007
#> 2 4.8931947 8.282395
#> 3 2.4137504 4.080676
#> 4 5.7968296 9.948056
#> 5 4.3081985 8.054280
#> 6 2.5380249 7.743522
#> 7 1.6429072 3.100132
#> 8 2.3105753 4.175262
#> 9 2.1871016 8.202766
#> 10 3.7518964 8.146120
#> 11 5.9150520 9.287358
#> 12 1.3699347 1.978866
#> 13 3.7591325 4.584301
#> 14 5.1650980 8.229323
#> 15 4.3320181 4.734298
#> 16 -0.5007689 1.723683
#> 17 3.3452634 8.362255
#> 18 0.9292991 3.768794
#> 19 4.9152530 6.755241
#> 20 1.0954642 1.077221
#> 21 2.4474979 4.141598
#> 22 4.6433913 8.628211
#> 23 4.1404019 6.778669
#> 24 6.1254019 8.987460
#> 25 4.5214864 9.603369
#> 26 4.0504724 3.703488
#> 27 4.5763059 8.374734
#> 28 4.3357868 4.729989
#> 29 3.3444561 3.750651
#> 30 4.1465791 5.665703
#> 31 2.1578178 7.521980
#> 32 1.9609490 1.326455
#> 33 3.3738200 9.518249
#> 34 3.3538673 5.185436
#> 35 2.6333517 9.018702
#> 36 2.4942810 4.915578
#> 37 3.8007426 3.997709
#> 38 2.9718932 8.508268
#> 39 4.8894148 4.411281
#> 40 4.5558673 8.595222
#> 41 4.6488715 8.007686
#> 42 3.9956681 7.781198
#> 43 4.2139132 7.055589
#> 44 3.9063102 3.587769
#> 45 4.4792488 9.796029
#> 46 4.7076808 7.450139
#> 47 2.5515069 3.410406
#> 48 1.2758184 1.270140
#> 49 2.4536496 6.023307
#> 50 3.6064859 4.992764
#> 51 4.0638982 6.226682
#> 52 3.5000614 4.404522
#> 53 3.3961279 3.656227
#> 54 3.2624292 8.140726
#> 55 2.4579680 3.575334
#> 56 5.1745566 9.625719
#> 57 3.1324650 4.661983
#> 58 4.9610940 9.516887
#> 59 3.7286589 7.067554
#> 60 4.2697903 3.931327
#> 61 3.6815286 7.265746
#> 62 5.5700792 9.508500
#> 63 5.1648382 5.678420
#> 64 1.8103229 2.647777
#> 65 4.7291421 9.258796
#> 66 1.8635523 3.707809
#> 67 2.5569000 1.890157
#> 68 5.3924140 9.694692
#> 69 5.3828237 7.690062
#> 70 4.2042933 2.429835
#> 71 4.5110811 3.409121
#> 72 2.4829185 4.966140
#> 73 4.7295659 9.106231
#> 74 3.1687328 3.879241
#> 75 2.1894319 5.245921
#> 76 2.8204314 2.613819
#> 77 3.9642139 6.715504
#> 78 3.8718682 8.996986
#> 79 4.3989075 8.250813
#> 80 2.4347479 5.188651
#> 81 3.3879994 6.797406
#> 82 3.7085525 7.321683
#> 83 2.9047902 5.802903
#> 84 3.1441057 8.895249
#> 85 2.9483110 9.480564
#> 86 2.2639374 2.476536
#> 87 3.2728221 4.507343
#> 88 2.5205627 2.574187
#> 89 2.0209244 3.831910
#> 90 3.5027177 7.287288
#> 91 3.1018521 6.522201
#> 92 4.9502369 8.963631
#> 93 1.8117292 4.244663
#> 94 6.2061094 7.727389
#> 95 4.7863444 1.854713
#> 96 2.5202512 8.859881
#> 97 3.7115434 3.699677
#> 98 3.7682327 7.864654
#> 99 3.4832679 4.238855
#> 100 2.4147340 7.063403
#>
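For readers without the package installed, the data-generating step in the example can be approximated in base R. This sketch assumes glm_data() draws gaussian responses with unit-variance noise around the linear predictor, which is an assumption for illustration, not the package's documented behavior:

```r
# Base-R sketch of the glm_data() call in the example above.
# Assumption: gaussian responses with standard-normal noise around the
# linear predictor b0 + b1*log(W1) + b2*A.
set.seed(1)
n <- 100
b0 <- 1; b1 <- 1.5; b2 <- 2
W1 <- runif(n, min = 1, max = 10)
A <- rbinom(n, 1, 0.5)
Y <- b0 + b1 * log(W1) + b2 * A + rnorm(n)
dat_treat_sketch <- data.frame(Y = Y, W1 = W1, A = A)
str(dat_treat_sketch)
```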