It didn't seem to work, but I didn't mess with it too much since I found an alternative workaround. I am wondering, though, whether the workaround will add too much overhead to the website?
[php]function ck_func( $atts, $content = 'defaultvalue' ) {
    // Extract the "type" attribute passed inside the shortcode brackets.
    // shortcode_atts() takes an array, so extra attributes can be added below alongside "type".
    extract( shortcode_atts( array(
        'type' => 'sampledefaulttype',
    ), $atts ) );

    // Pull the contents of the URL into a string. Note $content: whatever sits between
    // the opening and closing shortcode tags becomes part of the link.
    $jsonbad = file_get_contents( "https://affiliate.creditkarma.com/api/v1/offers/card/$content-json?pubKey=9LJJ2TMZ3WE9M923" );

    // Strip out an open and close bracket in CK's JSON that seems to cause problems.
    $jsonstillbad = str_replace( '"offers":[{', '"offers":{', $jsonbad );
    $json         = str_replace( '}}]}', '}}}', $jsonstillbad );

    // Decode the JSON into an associative array.
    $obj = json_decode( $json, true );

    // Return the text to the WordPress page. The "long" and "short" summaries are nested
    // one level deeper, so handle those first; otherwise return whatever field matches
    // $type at the second level of nesting.
    if ( $type == 'long' ) {
        return $obj['offers'][0]['summary']['long'];
    } elseif ( $type == 'short' ) {
        return $obj['offers'][0]['summary']['short'];
    } else {
        return $obj['offers'][0][ $type ];
    }
}
add_shortcode( 'ck', 'ck_func' );[/php]
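For context, this is roughly how I'm calling it in a page (the slug between the tags is just a placeholder, not a real card):

[php][ck type="long"]some-card-slug[/ck][/php]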
BTW, for efficiency, is it better to read the attributes straight out of the array or to use extract() the way I do above?
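To show what I mean by the array method, this is the sketch I'd swap in for the extract() call (same default, just reading the key directly):

[php]$a = shortcode_atts( array(
    'type' => 'sampledefaulttype',
), $atts );
$type = $a['type']; // read the attribute from the merged array instead of extract()[/php]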
Also, will the JSON requests be cached automatically, or should I do something to make this more efficient? My plan is to extract 3 or 4 types of data from 20 different JSON URLs on my homepage. If this doesn't cache automatically, I would guess the overhead would be high. I am using a WordPress plugin to cache the page, though I am not sure if that helps this situation. Is there an easy way to pull down the 20 JSON pages once a day, or is that a whole extra level of programming complication that will go over my head?
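In case it helps to show what I have in mind, here is a rough sketch of caching each feed with a WordPress transient (the helper name, the cache key, and the once-a-day expiry are just my guesses at a sensible setup, not anything Credit Karma requires):

[php]function ck_get_json( $url ) {
    // Build a cache key from the URL so each of the 20 feeds gets its own entry.
    $key  = 'ck_json_' . md5( $url );
    $json = get_transient( $key );

    if ( false === $json ) {
        // Not cached (or expired): fetch the feed and keep it for a day.
        $json = file_get_contents( $url );
        set_transient( $key, $json, DAY_IN_SECONDS );
    }
    return $json;
}[/php]

Then ck_func() would call ck_get_json() instead of file_get_contents() directly, so each URL only gets fetched once a day no matter how many shortcodes are on the homepage. Does that sound like the right approach, or is the page-caching plugin already covering this?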
Thanks!