Laravel - Import of 50K+ Records in MySQL Gives "General error: 1390 Prepared statement contains too many placeholders"
Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders?

I did an import via Sequel Pro of over 50,000 records. When I then go to view these records in my view (Laravel 4), I get General error: 1390 Prepared statement contains too many placeholders.

The index() method below, in the AdminNotesController.php file, generates the query and renders the view.
```php
public function index()
{
    $created_at_value       = Input::get('created_at_value');
    $note_types_value       = Input::get('note_types_value');
    $contact_names_value    = Input::get('contact_names_value');
    $user_names_value       = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value))    $created_at_value    = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value))    $note_types_value    = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) cname'))->lists('cname');
    if (is_null($user_names_value))    $user_names_value    = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) uname'))->lists('uname');

    // In the view there is a dropdown box that allows the user to select the
    // number of records to show per page. Retrieve that value or set a default.
    $perpage = Input::get('perpage', 10);

    // Retrieve the sort order from the session; it is set when the user clicks
    // a table column title. The value is placed in the session via the
    // getOrder() method and is used below in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id nid')));

    if (!empty($created_at_value)) {
        $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);
    }

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perpage)
        ->appends(array(
            'created_at_value'    => Input::get('created_at_value'),
            'note_types_value'    => Input::get('note_types_value'),
            'contact_names_value' => Input::get('contact_names_value'),
            'user_names_value'    => Input::get('user_names_value'),
        ));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes'       => $notes,
        'created_at'  => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types'  => DB::table('note_types')->lists('type', 'type'),
        'contacts'    => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) cname'))->lists('cname', 'cname'),
        'accounts'    => Account::lists('company_name', 'company_name'),
        'users'       => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) uname'))->lists('uname', 'uname'),
        'notes_trash' => $notes_trash,
        'perpage'     => $perpage,
    ));
}
```
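For context on where the placeholders come from: every value passed to a whereIn() call is bound as its own `?` placeholder in the prepared statement. With no filter selected, the controller above feeds whereIn() one entry per row of the underlying tables, so a 50,000-row notes table alone exceeds MySQL's 65,535-placeholder limit. A minimal sketch of that arithmetic, with made-up list sizes:

```php
<?php
// Each value handed to whereIn() becomes one "?" placeholder, so the total
// for a statement is simply the sum of the whereIn() list lengths.
function countPlaceholders(array ...$whereInLists): int
{
    return array_sum(array_map('count', $whereInLists));
}

// Hypothetical sizes for illustration: one created_at per note row, plus
// the other filter lists. 50,000 + 12 + 16,000 = 66,012 placeholders,
// which is over MySQL's 65,535 limit -- hence error 1390.
$created_at_values = range(1, 50000);
$note_types        = range(1, 12);
$contact_names     = range(1, 16000);

$total = countPlaceholders($created_at_values, $note_types, $contact_names);
```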
Any advice is appreciated. Thanks.
While I think @the disintegrator is correct that placeholders are limited, you do not have to run one query per record.

I had a query that worked fine until I added one more column; now I have 72k placeholders and get this error. However, those 72k are made up of 9,000 rows with 8 columns each. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)
What I recommend is somewhat of a hack. First, dynamically determine the maximum number of placeholders you want to allow (e.g. 60k is safe). Use that number to determine, based on the number of columns, how many complete records you can import/return at once. Then create the full array of data for the query, and use array_chunk and a foreach loop to grab what you want in the minimum number of queries. Like this:
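The batch-size arithmetic can be sketched as a small helper (the function name is made up for illustration; MySQL's hard limit is 65,535 placeholders per prepared statement, since the count is stored in a 16-bit field, so capping at 60,000 leaves headroom):

```php
<?php
// Maximum placeholders allowed per statement. MySQL's hard limit is 65,535,
// so 60,000 leaves some safety margin.
const MAX_PLACEHOLDERS = 60000;

// Given the number of columns per record, how many complete records
// fit into one prepared statement?
function maxRecordsPerQuery(int $numColumns): int
{
    return intdiv(MAX_PLACEHOLDERS, $numColumns);
}

// The failing case above: 9,000 rows x 8 columns = 72,000 placeholders.
// Splitting at 7,500 records per statement means only 2 statements total.
$batch      = maxRecordsPerQuery(8);        // 7500
$statements = (int) ceil(9000 / $batch);    // 2
```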
```php
$maxRecords = 1000;
$sql = 'INSERT INTO ... VALUES ';
// One "(?, ...)" group of placeholders per record in a full-sized chunk.
$qMarks = array_fill(0, $maxRecords, '(?, ...)');
$tmp = $sql . implode(', ', $qMarks);

foreach (array_chunk($data, $maxRecords) as $junk => $dataArray) {
    if (count($dataArray) < $maxRecords) {
        break;
    }
    // PDO stuff here using the $tmp SQL statement with placeholders (?s)
}
// Insert the leftovers with the same code as above, except accounting for
// the fact that you now have fewer than $maxRecords.
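The sketch above can be made self-contained. Here is a minimal runnable version of the same chunking idea (the table name `adwords_data` and its columns are made up for illustration); rather than handling leftovers separately, it rebuilds the placeholder string per chunk so the final, smaller chunk works with the same code path:

```php
<?php
// Minimal runnable sketch of batched inserts via array_chunk. The table name
// adwords_data and its columns are hypothetical; any PDO connection works.
function insertInChunks(PDO $pdo, array $data, int $maxRecords = 1000): void
{
    if (empty($data)) {
        return;
    }
    $numCols = count($data[0]);

    foreach (array_chunk($data, $maxRecords) as $chunk) {
        // One "(?, ?, ...)" group per record in this chunk, sized to fit
        // the last, possibly smaller, chunk as well.
        $group = '(' . implode(', ', array_fill(0, $numCols, '?')) . ')';
        $sql   = 'INSERT INTO adwords_data (campaign, clicks, cost) VALUES '
               . implode(', ', array_fill(0, count($chunk), $group));

        // Flatten the chunk's rows into one flat list of bound values.
        $values = array_merge(...array_map('array_values', $chunk));
        $pdo->prepare($sql)->execute($values);
    }
}
```

With $maxRecords derived from the placeholder cap divided by the column count, this keeps the statement count minimal while never exceeding the limit.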