func (s *Subscriber) getMsgAsResult(key string) *toolkit.Result {
	result := toolkit.NewResult()

	// if no key is provided, use the first queued key
	if key == "" {
		if len(s.messageKeys) > 0 {
			key = s.messageKeys[0]
		}
	}

	if key == "" {
		result.Status = toolkit.Status_NOK
		result.Message = "No key has been provided to receive the message"
	} else {
		msgQue, exist := s.MessageQues[key]
		if !exist {
			result.Status = toolkit.Status_NOK
			result.Message = "Key " + key + " does not exist in the message queue or it has already been collected. Available keys are: " + strings.Join(s.messageKeys, ",")
		} else {
			url := fmt.Sprintf("%s/getmsg", s.BroadcasterAddress)
			r, e := toolkit.HttpCall(url, "POST", toolkit.Jsonify(msgQue), nil)
			if e != nil {
				result.SetErrorTxt("Subscriber ReceiveMsg Call Error: " + e.Error())
			} else if r.StatusCode != 200 {
				result.SetErrorTxt("Subscriber ReceiveMsg Call Error: " + r.Status)
			} else {
				e := toolkit.Unjson(toolkit.HttpContent(r), &result)
				if e != nil {
					result.SetErrorTxt("Subscriber ReceiveMsg Decode Error: " + e.Error())
				}
			}
		}
	}

	// acknowledge the broadcaster so the key can be released
	if result.Status == toolkit.Status_OK {
		url := fmt.Sprintf("%s/msgreceived", s.BroadcasterAddress)
		toolkit.HttpCall(url, "POST", toolkit.Jsonify(struct {
			Key        string
			Subscriber string
		}{key, s.Address}), nil)
	}
	return result
}
func (g *GetDatabase) ResultFromDatabase(dataSettingId string, out interface{}) error {
	c, e := dbox.NewConnection(g.desttype, &g.ConnectionInfo)
	if e != nil {
		return e
	}
	e = c.Connect()
	if e != nil {
		return e
	}
	defer c.Close()

	iQ := c.NewQuery()
	if g.CollectionSettings[dataSettingId].Collection != "" {
		iQ.From(g.CollectionSettings[dataSettingId].Collection)
	}
	for _, val := range g.CollectionSettings[dataSettingId].MapsColumns {
		iQ.Select(val.Source)
	}
	if len(g.CollectionSettings[dataSettingId].FilterCond) > 0 {
		iQ.Where(g.CollectionSettings[dataSettingId].filterDbox)
	}

	csr, e := iQ.Cursor(nil)
	if e != nil {
		return e
	}
	if csr == nil {
		// a nil cursor without an error still means the query could not be served
		return errors.New("ResultFromDatabase: cursor is nil")
	}
	defer csr.Close()

	results := make([]toolkit.M, 0)
	e = csr.Fetch(&results, 0, false)
	if e != nil {
		return e
	}

	// remap each record from destination column names back to their source names
	ms := []toolkit.M{}
	for _, val := range results {
		m := toolkit.M{}
		for _, column := range g.CollectionSettings[dataSettingId].MapsColumns {
			m.Set(column.Source, "")
			if val.Has(column.Destination) {
				m.Set(column.Source, val[column.Destination])
			}
		}
		ms = append(ms, m)
	}

	if edecode := toolkit.Unjson(toolkit.Jsonify(ms), out); edecode != nil {
		return edecode
	}
	return nil
}
func (g *Grabber) DataByte() []byte {
	d := g.Data()
	if toolkit.IsValid(d) {
		return toolkit.Jsonify(d)
	}
	return []byte{}
}
func TestDelete(t *testing.T) {
	//t.Skip()
	ctx, e := prepareContext()
	if e != nil {
		t.Errorf("Error Connect: %s", e.Error())
		return
	}
	defer ctx.Close()

	u := new(UserModel)
	e = ctx.GetById(u, "user2")
	if e != nil {
		t.Errorf("Error GetById: %s", e.Error())
		return
	}

	fmt.Printf("Will Delete UserModel:\n %s \n", tk.JsonString(u))
	e = ctx.Delete(u)
	if e != nil {
		t.Errorf("Error Delete: %s", e.Error())
		return
	}
	tk.Unjson(tk.Jsonify(u), u)
	fmt.Printf("UserModel: %v has been deleted \n", u.RandomDate.UTC())
	fmt.Println("")
}
func (q *Query) Cursor(in toolkit.M) (dbox.ICursor, error) {
	var (
		e        error
		dataMaps []toolkit.M
	)
	q.ReadFile(&dataMaps, q.Connection().(*Connection).filePath)
	cursor := dbox.NewCursor(new(Cursor))

	filters, e := q.Filters(in)
	if e != nil {
		return nil, errorlib.Error(packageName, modQuery, "Cursor", e.Error())
	}

	commandType := filters.GetString("cmdType")
	if commandType != dbox.QueryPartSelect {
		return nil, errorlib.Error(packageName, modQuery, "Cursor",
			"Cursor is only working with select command, for "+commandType+" please use .Exec instead")
	}

	aggregate := false
	hasWhere := filters.Has("where")
	if !aggregate {
		var whereFields []*dbox.Filter
		var dataInterface interface{}
		json.Unmarshal(toolkit.Jsonify(dataMaps), &dataInterface)

		if hasWhere {
			whereFields = filters.Get("where").([]*dbox.Filter)
			cursor.(*Cursor).isWhere = true
		}
		cursor = cursor.SetConnection(q.Connection())
		cursor.(*Cursor).whereFields = whereFields
		cursor.(*Cursor).jsonSelect = filters.Get("select").([]string)
		cursor.(*Cursor).readFile = toolkit.Jsonify(dataMaps)
	} else {
		return nil, errorlib.Error(packageName, modQuery, "Cursor", "No Aggregate function")
	}
	return cursor, nil
}
func (q *Query) writeFile() error {
	// create the file first if it does not exist yet
	_, e := os.Stat(q.jsonPath)
	if e != nil && os.IsNotExist(e) {
		f, e := os.Create(q.jsonPath)
		if e != nil {
			return err.Error(packageName, modQuery, "writeFile", e.Error())
		}
		f.Close()
	}

	bs := toolkit.Jsonify(q.data)
	e = ioutil.WriteFile(q.jsonPath, bs, 0644)
	if e != nil {
		return err.Error(packageName, modQuery, "writeFile", e.Error())
	}
	return nil
}
func parseHostAlias(what string, raw interface{}) string {
	hostAliases := []struct {
		IP       string `json:"ip" bson:"ip"`
		HostName string `json:"hostName" bson:"hostName"`
	}{}
	toolkit.Unjson(toolkit.Jsonify(raw), &hostAliases)

	for _, alias := range hostAliases {
		if strings.Contains(strings.Split(what, "webhdfs")[0], alias.HostName) {
			what = strings.Replace(what, alias.HostName, alias.IP, 1)
			break
		}
	}
	return what
}
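// Illustrative sketch only (not part of the package): parseHostAlias rewrites the
// host part of a webhdfs URL using the first matching alias. The alias values and
// URL below are hypothetical.
//
//	aliases := []map[string]string{{"ip": "10.0.0.5", "hostName": "namenode01"}}
//	url := parseHostAlias("http://namenode01:50070/webhdfs/v1/tmp?op=LISTSTATUS", aliases)
//	// url == "http://10.0.0.5:50070/webhdfs/v1/tmp?op=LISTSTATUS"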
func (s *Subscriber) AddChannel(c string) error {
	url := fmt.Sprintf("%s/channelregister", s.BroadcasterAddress)
	r, e := toolkit.HttpCall(url, "POST", toolkit.Jsonify(ChannelRegister{c, s.Address}), nil)
	if e != nil {
		return fmt.Errorf("Channel Register Call Error: %s", e.Error())
	}
	if r.StatusCode != 200 {
		return fmt.Errorf("Channel Register Call Error: %s", r.Status)
	}

	result := new(toolkit.Result)
	e = toolkit.Unjson(toolkit.HttpContent(r), &result)
	if e != nil {
		return fmt.Errorf("Channel Register Decode error: %s", e.Error())
	}
	if result.Status != toolkit.Status_OK {
		return fmt.Errorf("Channel Register error: %s", result.Message)
	}
	return nil
}
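// Illustrative usage sketch only (not part of the package): a subscriber usually
// registers the channels it listens on right after starting; the channel name
// "orders" and the variable name are hypothetical.
//
//	if e := subscriber.AddChannel("orders"); e != nil {
//		log.Printf("unable to register channel: %s", e.Error())
//	}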
func (g *Grabber) ResultFromHtml(dataSettingId string, out interface{}) error {
	reader := bytes.NewReader(g.bodyByte)
	doc, e := gq.NewDocumentFromReader(reader)
	if e != nil {
		return e
	}

	ms := []toolkit.M{}
	records := doc.Find(g.Config.DataSettings[dataSettingId].RowSelector)
	recordCount := records.Length()
	for i := 0; i < recordCount; i++ {
		record := records.Eq(i)
		m := toolkit.M{}
		for cindex, c := range g.Config.DataSettings[dataSettingId].ColumnSettings {
			// fall back to the column index when no alias is configured
			columnId := fmt.Sprintf("%v", cindex)
			if c.Alias != "" {
				columnId = c.Alias
			}

			sel := record.Find(c.Selector)
			var value interface{}
			valuetype := strings.ToLower(c.ValueType)
			if valuetype == "attr" {
				value, _ = sel.Attr(c.AttrName)
			} else if valuetype == "html" {
				value, _ = sel.Html()
			} else {
				value = sel.Text()
			}
			value = strings.TrimSpace(fmt.Sprintf("%s", value))
			m.Set(columnId, value)
		}
		if g.Config.DataSettings[dataSettingId].getCondition(m) {
			ms = append(ms, m)
		}
	}

	if edecode := toolkit.Unjson(toolkit.Jsonify(ms), out); edecode != nil {
		return edecode
	}
	return nil
}
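// Illustrative usage sketch only (not part of the package): once the page body has
// been fetched, each configured data setting can be decoded into a generic slice.
// The data-setting id "doc01" and variable names are hypothetical.
//
//	rows := []toolkit.M{}
//	if e := grabber.ResultFromHtml("doc01", &rows); e != nil {
//		log.Printf("grab decode error: %s", e.Error())
//	}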
func (m *MessageMonitor) ditributeBroadcast() { //for len(m.Targets) != m.Success && time.Now().After(m.Expiry) == false { wg := new(sync.WaitGroup) for k, t := range m.Targets { wg.Add(1) go func(wg *sync.WaitGroup, k int, t string) { defer wg.Done() if m.Status[k] != "OK" { var command, url string if m.Command != "" { command = m.Command } else { command = "msg" } url = fmt.Sprintf("http://%s/%s", t, command) r, ecall := toolkit.HttpCall(url, "POST", toolkit.Jsonify(Message{Key: m.Key, Data: m.Data, Expiry: m.Expiry}), nil) if ecall != nil { m.setSuccessFail(k, "CALL ERROR: "+url+" ERR:"+ecall.Error()) } else if r.StatusCode != 200 { m.setSuccessFail(k, fmt.Sprintf("CALL STATUS ERROR: %s ERR: %s", url, r.Status)) } else { var result toolkit.Result bs := toolkit.HttpContent(r) edecode := toolkit.Unjson(bs, &result) if edecode != nil { m.setSuccessFail(k, "DECODE ERROR: "+string(bs)+" ERR:"+edecode.Error()) } else { m.setSuccessFail(k, toolkit.IfEq(result.Status, toolkit.Status_OK, "OK", result.Message).(string)) } } } }(wg, k, t) } wg.Wait() //time.Sleep(1 * time.Millisecond) //fmt.Printf("%d = %d \n", len(m.Targets), m.Success+m.Fail) //} }
func (m *MessageMonitor) distributeQue() {
	//--- inform all targets that a new message has been created
	msg := toolkit.Jsonify(Message{Key: m.Key, Data: m.Data, Expiry: m.Expiry})
	wg := new(sync.WaitGroup)
	targetCount := len(m.Targets)

	var mtx sync.Mutex // guards newtargets and failtargets, appended from the goroutines below
	var newtargets []string
	var failtargets []string
	for _, t := range m.Targets {
		wg.Add(1)
		go func(wg *sync.WaitGroup, t string) {
			defer wg.Done()
			url := fmt.Sprintf("%s://%s/newkey", "http", t)
			r, e := toolkit.HttpCall(url, "POST", msg, nil)
			mtx.Lock()
			defer mtx.Unlock()
			if e != nil {
				m.Broadcaster.Log().Warning(fmt.Sprintf(
					"Unable to inform %s for new queue %s. %s", url, m.Key, e.Error()))
				failtargets = append(failtargets, t)
			} else if r.StatusCode != 200 {
				m.Broadcaster.Log().Warning(fmt.Sprintf(
					"Unable to inform %s for new queue %s %s", url, m.Key, r.Status))
				failtargets = append(failtargets, t)
			} else {
				newtargets = append(newtargets, t)
			}
		}(wg, t)
	}
	wg.Wait()
	m.Targets = newtargets
	m.Broadcaster.Log().Info(fmt.Sprintf("Ping %d servers for new message %s. Success: %d Fail: %d",
		targetCount, m.Key, len(newtargets), len(failtargets)))

	//-- block until every remaining target has received the message or it expires
	for len(m.Targets) != m.Success && time.Now().After(m.Expiry) == false {
		time.Sleep(1 * time.Millisecond)
	}
}
func (q *Query) Exec(in toolkit.M) error { setting, e := q.prepare(in) commandType := setting["commandtype"].(string) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, e.Error()) } if setting.GetString("commandtype") == dbox.QueryPartSelect { return err.Error(packageName, modQuery, "Exec: "+commandType, "Exec is not working with select command, please use .Cursor instead") } q.Lock() defer q.Unlock() var dataM toolkit.M var dataMs []toolkit.M hasData := in.Has("data") dataIsSlice := false data := in.Get("data") if toolkit.IsSlice(data) { dataIsSlice = true e = toolkit.Unjson(toolkit.Jsonify(data), dataMs) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data encoding error: "+e.Error()) } } else { dataM, e = toolkit.ToM(data) dataMs = append(dataMs, dataM) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data encoding error: "+e.Error()) } } hasWhere := in.Has("where") where := in.Get("where", []*dbox.Filter{}).([]*dbox.Filter) if hasData && hasWhere == false && toolkit.HasMember([]interface{}{dbox.QueryPartInsert, dbox.QueryPartUpdate, dbox.QueryPartSave}, commandType) { hasWhere = true if toolkit.IsSlice(data) { ids := []interface{}{} idField := "" if idField == "" { return err.Error(packageName, modQuery, "Exec:"+commandType, "Data send is a slice, but its element has no ID") } dataCount := toolkit.SliceLen(data) for i := 0; i < dataCount; i++ { dataI := toolkit.SliceItem(data, i) if i == 0 { idField = toolkit.IdField(dataI) } ids = append(ids, toolkit.Id(dataI)) } where = []*dbox.Filter{dbox.In(idField, ids)} } else { id := toolkit.Id(data) if toolkit.IsNilOrEmpty(id) { where = []*dbox.Filter{dbox.Eq(toolkit.IdField(id), id)} } else { where = nil hasWhere = false } } } q.openFile() if commandType == dbox.QueryPartInsert { if !hasData { return err.Error(packageName, modQuery, "Exec:"+commandType, "Data is empty") } if dataIsSlice { q.data = append(q.data, dataMs...) } else { q.data = append(q.data, dataM) } } else if commandType == dbox.QueryPartUpdate { if !hasData { return err.Error(packageName, modQuery, "Exec:"+commandType, "Data is empty") } var indexes []interface{} if hasWhere { toolkit.Serde(dbox.Find(q.data, where), &indexes, "") } var dataUpdate toolkit.M var updateDataIndex int isDataSlice := toolkit.IsSlice(data) if isDataSlice == false { isDataSlice = false e = toolkit.Serde(data, &dataUpdate, "") if e != nil { return err.Error(packageName, modQuery, "Exec:"+commandType, "Unable to serialize data. "+e.Error()) } } var idField string for i, v := range q.data { if toolkit.HasMember(indexes, i) || len(indexes) == 0 { if idField == "" { idField = toolkit.IdField(v) if idField == "" { return err.Error(packageName, modQuery, "Exec:"+commandType, "No ID") } } var dataOrigin toolkit.M e = toolkit.Serde(v, &dataOrigin, "") if e != nil { return err.Error(packageName, modQuery, "Exec:"+commandType, "Unable to serialize data origin. "+e.Error()) } if isDataSlice { e = toolkit.Serde(toolkit.SliceItem(data, updateDataIndex), &dataUpdate, "") if e != nil { return err.Error(packageName, modQuery, "Exec:"+commandType, "Unable to serialize data. 
"+e.Error()) } updateDataIndex++ } for fieldName, fieldValue := range dataUpdate { if fieldName != idField { if dataOrigin.Has(fieldName) { dataOrigin.Set(fieldName, fieldValue) } } } toolkit.Serde(dataOrigin, &v, "") q.data[i] = v } } } else if commandType == dbox.QueryPartDelete { if hasWhere { var indexes []interface{} toolkit.Serde(dbox.Find(q.data, where), &indexes, "") if len(indexes) > 0 { newdata := []toolkit.M{} for index, v := range q.data { if toolkit.HasMember(indexes, index) == false { newdata = append(newdata, v) } } q.data = newdata } } else { q.data = []toolkit.M{} } } else if commandType == dbox.QueryPartSave { if !hasData { return err.Error(packageName, modQuery, "Exec:"+commandType, "Data is empty") } } q.writeFile() return nil }
func (c *Cursor) Fetch(m interface{}, n int, closeWhenDone bool) error { if closeWhenDone { defer c.Close() } // if !toolkit.IsPointer(m) { // return errorlib.Error(packageName, modCursor, "Fetch", "Model object should be pointer") // } if n != 1 && reflect.ValueOf(m).Elem().Kind() != reflect.Slice { return errorlib.Error(packageName, modCursor, "Fetch", "Model object should be pointer of slice") } e := c.prepIter() if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } datas := []toolkit.M{} // lineCount := 0 //============================= maxGetData := c.count if n > 0 { maxGetData = c.fetchRow + n } linecount := 0 for _, row := range c.reader.Sheet[c.sheetname].Rows { isAppend := true recData := toolkit.M{} appendData := toolkit.M{} for i, cell := range row.Cells { if i < len(c.headerColumn) { recData.Set(c.headerColumn[i].name, cell.Value) if len(c.ConditionVal.Select) == 0 || c.ConditionVal.Select.Get("*", 0).(int) == 1 { appendData.Set(c.headerColumn[i].name, cell.Value) } else { if c.ConditionVal.Select.Get(c.headerColumn[i].name, 0).(int) == 1 { appendData.Set(c.headerColumn[i].name, cell.Value) } } } } isAppend = c.ConditionVal.getCondition(recData) if c.fetchRow < c.ConditionVal.skip || (c.fetchRow > (c.ConditionVal.skip+c.ConditionVal.limit) && c.ConditionVal.limit > 0) { isAppend = false } if isAppend && len(appendData) > 0 { linecount += 1 if linecount > c.fetchRow { datas = append(datas, appendData) c.fetchRow += 1 } } if c.fetchRow >= maxGetData { break } } e = toolkit.Unjson(toolkit.Jsonify(datas), m) if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } return nil }
func (q *Query) ExecOut(parm toolkit.M) (int64, error) { var e error if parm == nil { parm = toolkit.M{} } driverName := q.GetDriverDB() // driverName = "oracle" tablename := "" data := parm.Get("data") var attributes string var values string var setUpdate string var dataM toolkit.M var dataMs []toolkit.M var returnId int64 if toolkit.IsSlice(data) { e = toolkit.Unjson(toolkit.Jsonify(data), &dataMs) if e != nil { return returnId, errorlib.Error(packageName, modQuery, "Exec: data extraction", "Data encoding error: "+e.Error()) } } else { dataM, e = toolkit.ToM(data) dataMs = append(dataMs, dataM) if e != nil { return returnId, errorlib.Error(packageName, modQuery, "Exec: data extraction", "Data encoding error: "+e.Error()) } } for _, dataVal := range dataMs { temp := "" quyerParts := q.Parts() c := crowd.From(&quyerParts) groupParts := c.Group(func(x interface{}) interface{} { qp := x.(*dbox.QueryPart) temp = toolkit.JsonString(qp) return qp.PartType }, nil).Exec() parts := map[interface{}]interface{}{} if len(groupParts.Result.Data().([]crowd.KV)) > 0 { for _, kv := range groupParts.Result.Data().([]crowd.KV) { parts[kv.Key] = kv.Value } } commandType := "" _, hasDelete := parts[dbox.QueryPartDelete] _, hasInsert := parts[dbox.QueryPartInsert] _, hasUpdate := parts[dbox.QueryPartUpdate] _, hasSave := parts[dbox.QueryPartSave] if hasDelete { commandType = dbox.QueryPartDelete } else if hasInsert { commandType = dbox.QueryPartInsert } else if hasUpdate { commandType = dbox.QueryPartUpdate } else if hasSave { commandType = dbox.QueryPartSave } if hasInsert || hasUpdate || hasSave { attributes, setUpdate, values = extractData(dataVal, driverName) } else if hasDelete { } fromParts, hasFrom := parts[dbox.QueryPartFrom] if !hasFrom { return returnId, errorlib.Error(packageName, "Query", modQuery, "Invalid table name") } tablename = fromParts.([]*dbox.QueryPart)[0].Value.(string) var where interface{} whereParts, hasWhere := parts[dbox.QueryPartWhere] if hasWhere { fb := q.Connection().Fb() for _, p := range whereParts.([]*dbox.QueryPart) { fs := p.Value.([]*dbox.Filter) for _, f := range fs { fb.AddFilter(f) } } where, e = fb.Build() if e != nil { } else { } } var id string var idVal interface{} if where == nil { id, idVal = toolkit.IdInfo(dataVal) if id != "" { where = id + " = " + StringValue(idVal, "non") } } session := q.Session() sessionHive := q.SessionHive() if commandType == dbox.QueryPartInsert { if attributes != "" && values != "" { var statement string if driverName == "hive" { statement = "INSERT INTO " + tablename + " VALUES " + values e = sessionHive.Exec(statement, nil) } else { statement = "INSERT INTO " + tablename + " " + attributes + " VALUES " + values var res sql.Result res, e = session.Exec(statement) if res != nil { returnId, _ = res.LastInsertId() } } if e != nil { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, cast.ToString(e.Error())) } } else { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, "please provide the data") } } else if commandType == dbox.QueryPartUpdate { if setUpdate != "" { var statement string if where != nil { statement = "UPDATE " + tablename + " SET " + setUpdate + " WHERE " + cast.ToString(where) } else { statement = "UPDATE " + tablename + " SET " + setUpdate } if driverName == "hive" { e = sessionHive.Exec(statement, nil) } else { _, e = session.Exec(statement) } if e != nil { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, cast.ToString(e.Error())) } } else { 
return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, "please provide the data") } } else if commandType == dbox.QueryPartDelete { var statement string if where != nil { statement = "DELETE FROM " + tablename + " where " + cast.ToString(where) } else { statement = "DELETE FROM " + tablename } if driverName == "hive" { e = sessionHive.Exec(statement, nil) } else { _, e = session.Exec(statement) } if e != nil { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, cast.ToString(e.Error())) } } else if commandType == dbox.QueryPartSave { if attributes != "" && values != "" { var querystmt string if where != nil { querystmt = "select 1 as data from " + tablename + " where " + cast.ToString(where) } var rowCount int if driverName == "hive" { rowCount = 0 // row := sessionHive.Exec(querystmt, nil) // rowCount = toolkit.ToInt(row[0], "auto") } else { if querystmt != "" { rows, _ := session.Query(querystmt) for rows.Next() { rows.Scan(&rowCount) } } } var statement string if rowCount == 0 || where == nil { if driverName == "hive" { statement = "INSERT INTO " + tablename + " VALUES " + values } else { statement = "INSERT INTO " + tablename + " " + attributes + " VALUES " + values } } else { statement = "UPDATE " + tablename + " SET " + setUpdate + " WHERE " + cast.ToString(where) } if driverName == "hive" { e = sessionHive.Exec(statement, nil) } else { _, e = session.Exec(statement) } if e != nil { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, cast.ToString(e.Error())) } } else if values == "" { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, "please provide the data") } } if e != nil { return returnId, errorlib.Error(packageName, modQuery+".Exec", commandType, e.Error()) } } return returnId, nil }
func (q *Query) insertBulk(parm toolkit.M) error { var e error if parm == nil { parm = toolkit.M{} } driverName := q.GetDriverDB() // driverName = "oracle" tablename := "" data := parm.Get("data") var attributes string var dataM toolkit.M var dataMs []toolkit.M if toolkit.IsSlice(data) { e = toolkit.Unjson(toolkit.Jsonify(data), &dataMs) if e != nil { return errorlib.Error(packageName, modQuery, "Exec: data extraction", "Data encoding error: "+e.Error()) } } else { dataM, e = toolkit.ToM(data) dataMs = append(dataMs, dataM) if e != nil { return errorlib.Error(packageName, modQuery, "Exec: data extraction", "Data encoding error: "+e.Error()) } } temp := "" quyerParts := q.Parts() c := crowd.From(&quyerParts) groupParts := c.Group(func(x interface{}) interface{} { qp := x.(*dbox.QueryPart) temp = toolkit.JsonString(qp) return qp.PartType }, nil).Exec() parts := map[interface{}]interface{}{} if len(groupParts.Result.Data().([]crowd.KV)) > 0 { for _, kv := range groupParts.Result.Data().([]crowd.KV) { parts[kv.Key] = kv.Value } } commandType := "" _, hasInsert := parts[dbox.QueryPartInsert] if hasInsert { commandType = dbox.QueryPartInsert } else { _, e = q.ExecOut(parm) return e // return errorlib.Error(packageName, "Query", modQuery+".InsertBulk", "Invalid Operation") } fromParts, hasFrom := parts[dbox.QueryPartFrom] if !hasFrom { return errorlib.Error(packageName, "Query", modQuery, "Invalid table name") } tablename = fromParts.([]*dbox.QueryPart)[0].Value.(string) session := q.Session() attributeList := extractFields(dataMs[0]) var datas []string for _, dataVal := range dataMs { var values string tmp := toolkit.M{} for _, attr := range attributeList { tmp.Set(attr, dataVal.Get(attr)) } values = extractDataBulk(attributeList, tmp, driverName) // toolkit.Printf("test: \n %v \n------\n %v \n------\n %v \n------\n %v \n", attributeList, dataVal, tmp, values) datas = append(datas, values) } attributes = "(" + strings.Join(attributeList, ",") + ")" if attributes != "" && nil != datas { var statement string if driverName == "hive" { /*statement = "INSERT INTO " + tablename + " VALUES " + values e = sessionHive.Exec(statement, nil)*/ return errorlib.Error(packageName, modQuery+".Exec", commandType, "Not Implemented Yet for HIVE") } else { statement = fmt.Sprintf("INSERT INTO "+tablename+" "+attributes+" VALUES %s", strings.Join(datas, ",")) _, e = session.Exec(statement) } if e != nil { return errorlib.Error(packageName, modQuery+".Exec", commandType, cast.ToString(e.Error())) } } else { return errorlib.Error(packageName, modQuery+".Exec", commandType, "please provide the data") } return nil }
func (c *Cursor) Fetch(m interface{}, n int, closeWhenDone bool) error { if closeWhenDone { c.Close() } e := c.prepIter() if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } /*if c.jsonSelect == nil { return errorlib.Error(packageName, modCursor, "Fetch", "Iter object is not yet initialized") }*/ // var mData []interface{} datas := []toolkit.M{} dataJson := []toolkit.M{} dec := json.NewDecoder(strings.NewReader(string(c.readFile))) dec.Decode(&datas) if n == 0 { whereFieldsToMap, e := toolkit.ToM(c.whereFields) if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } b := c.getCondition(whereFieldsToMap) var foundSelected = toolkit.M{} var foundData = []toolkit.M{} var getRemField = toolkit.M{} if c.isWhere { if b { for _, v := range datas { for i, subData := range v { getRemField[i] = i //append(getRemField, i) for _, vWhere := range whereFieldsToMap { for _, subWhere := range vWhere.([]interface{}) { for _, subsubWhere := range subWhere.(map[string]interface{}) { if len(c.jsonSelect) == 0 { if strings.ToLower(subData.(string)) == strings.ToLower(subsubWhere.(string)) { dataJson = append(dataJson, v) } } else { if strings.ToLower(subData.(string)) == strings.ToLower(subsubWhere.(string)) { foundData = append(foundData, v) } } } } } } } itemToRemove := removeDuplicatesUnordered(getRemField, c.jsonSelect) if len(foundData) > 0 { var found toolkit.M for _, found = range foundData { for _, remitem := range itemToRemove { found.Unset(remitem) } dataJson = append(dataJson, found) } } } else { for _, v := range datas { for _, v2 := range v { for _, vWhere := range c.whereFields.(toolkit.M) { if reflect.ValueOf(v2).Kind() == reflect.String { if strings.ToLower(v2.(string)) == strings.ToLower(vWhere.(string)) { if len(c.jsonSelect) == 0 { dataJson = append(dataJson, v) } else { foundData = append(foundData, v) } } } } } } if len(foundData) > 0 { for _, found := range foundData { for i, subData := range found { for _, selected := range c.jsonSelect { if strings.ToLower(selected) == strings.ToLower(i) { foundSelected[i] = subData } else if selected == "*" { foundSelected[i] = subData } } } } dataJson = append(dataJson, foundSelected) } } // toolkit.Unjson(toolkit.Jsonify(dataJson), m) toolkit.Serde(dataJson, m, "json") } else { if c.jsonSelect == nil { toolkit.Unjson(toolkit.Jsonify(datas), m) } else { isSelectedFields := false for _, selectField := range c.jsonSelect { if selectField == "*" { // toolkit.Unjson(toolkit.Jsonify(datas), m) toolkit.Serde(datas, m, "json") } else { isSelectedFields = true } } if isSelectedFields { for _, v := range datas { for i, _ := range v { getRemField[i] = i } } itemToRemove := removeDuplicatesUnordered(getRemField, c.jsonSelect) for _, found := range datas { toMap := toolkit.M(found) for _, remitem := range itemToRemove { toMap.Unset(remitem) } dataJson = append(dataJson, toMap) } // toolkit.Unjson(toolkit.Jsonify(dataJson), m) toolkit.Serde(dataJson, m, "json") } } } } else if n > 0 { fetched := 0 fetching := true c.Connection().(*Connection).FetchSession() c.tempPathFile = c.Connection().(*Connection).tempPathFile ///read line fetchFile, e := os.OpenFile(c.Connection().(*Connection).tempPathFile, os.O_RDWR, 0) defer fetchFile.Close() if e != nil { return errorlib.Error(packageName, modQuery+".Exec", "Fetch file", e.Error()) } c.fetchSession = fetchFile scanner := bufio.NewScanner(fetchFile) lines := 0 for scanner.Scan() { lines++ } if lines > 0 { fetched = lines n = n + lines } for fetching { var dataM = 
toolkit.M{} if c.jsonSelect == nil { dataJson = append(dataJson, datas[fetched]) } else { for _, selectField := range c.jsonSelect { if selectField == "*" { dataJson = append(dataJson, datas[fetched]) } else { dataM.Set(selectField, datas[fetched][selectField]) if len(dataM) == len(c.jsonSelect) { dataJson = append(dataJson, dataM) } } } } // toolkit.Unjson(toolkit.Jsonify(dataJson), m) toolkit.Serde(dataJson, m, "json") io.WriteString(fetchFile, toolkit.JsonString(dataM)+"\n") fetched++ if fetched == n { fetching = false } } } fmt.Sprintln("") return nil }
func (q *Query) prepare(in toolkit.M) (out toolkit.M, e error) { out = toolkit.M{} quyerParts := q.Parts() c := crowd.From(&quyerParts) groupParts := c.Group(func(x interface{}) interface{} { return x.(*dbox.QueryPart).PartType }, nil).Exec() parts := map[interface{}]interface{}{} if len(groupParts.Result.Data().([]crowd.KV)) > 0 { for _, kv := range groupParts.Result.Data().([]crowd.KV) { parts[kv.Key] = kv.Value } } _, hasUpdate := parts[dbox.QueryPartUpdate] _, hasInsert := parts[dbox.QueryPartInsert] _, hasDelete := parts[dbox.QueryPartDelete] _, hasSave := parts[dbox.QueryPartSave] _, hasFrom := parts[dbox.QueryPartFrom] procedureParts, hasProcedure := parts["procedure"] var tableName string if hasFrom { fromParts, _ := parts[dbox.QueryPartFrom] tableName = fromParts.([]*dbox.QueryPart)[0].Value.(string) } else { return nil, err.Error(packageName, "Query", "prepare", "Invalid table name") } out.Set("tableName", tableName) if freeQueryParts, hasFreeQuery := parts["freequery"]; hasFreeQuery { var syntax string qsyntax := freeQueryParts.([]*dbox.QueryPart)[0].Value.(interface{}) syntax = qsyntax.(toolkit.M)["syntax"].(string) out.Set("freequery", syntax) out.Set("cmdType", dbox.QueryPartSelect) } else if hasInsert || hasUpdate || hasDelete || hasSave { if hasUpdate { out.Set("cmdType", dbox.QueryPartUpdate) } else if hasInsert { out.Set("cmdType", dbox.QueryPartInsert) } else if hasDelete { out.Set("cmdType", dbox.QueryPartDelete) } else if hasSave { out.Set("cmdType", dbox.QueryPartSave) } var where interface{} whereParts, hasWhere := parts[dbox.QueryPartWhere] if hasWhere { fb := q.Connection().Fb() for _, p := range whereParts.([]*dbox.QueryPart) { fs := p.Value.([]*dbox.Filter) for _, f := range fs { fb.AddFilter(f) } } where, e = fb.Build() if e != nil { } out.Set("where", where) } var dataM toolkit.M var dataMs []toolkit.M hasData := in.Has("data") var dataIsSlice bool if hasData { data := in.Get("data") if toolkit.IsSlice(data) { dataIsSlice = true e = toolkit.Unjson(toolkit.Jsonify(data), dataMs) if e != nil { return nil, err.Error(packageName, modQuery, "Exec: ", "Data encoding error: "+e.Error()) } } else { dataM, e = toolkit.ToM(data) dataMs = append(dataMs, dataM) if e != nil { return nil, err.Error(packageName, modQuery, "Exec: ", "Data encoding error: "+e.Error()) } } var id string var idVal interface{} if where == nil { id, idVal = toolkit.IdInfo(data) if id != "" { where = id + " = " + StringValue(idVal, "non") } out.Set("where", where) } if !dataIsSlice { var fields string var values string var setUpdate string var inc int for field, val := range dataM { stringval := StringValue(val, "non") if inc == 0 { fields = "(" + field values = "(" + stringval setUpdate = field + " = " + stringval } else { fields += ", " + field values += ", " + stringval setUpdate += ", " + field + " = " + stringval } inc++ } fields += ")" values += ")" if hasInsert || hasSave { out.Set("fields", fields) out.Set("values", values) } if hasUpdate || hasSave { out.Set("setUpdate", setUpdate) } } } } else if hasProcedure { cmd := procedureParts.([]*dbox.QueryPart)[0].Value.(interface{}) spName := cmd.(toolkit.M)["name"].(string) + " " params, hasParams := cmd.(toolkit.M)["params"] orderparam, hasOrder := cmd.(toolkit.M)["orderparam"] ProcStatement := "" toolkit.Println(spName, params, hasParams, orderparam, hasOrder, ProcStatement) } else { var selectField string incAtt := 0 if selectParts, hasSelect := parts[dbox.QueryPartSelect]; hasSelect { for _, sl := range selectParts.([]*dbox.QueryPart) { for 
_, fid := range sl.Value.([]string) { if incAtt == 0 { selectField = fid } else { selectField = selectField + ", " + fid } incAtt++ } } } out.Set("cmdType", dbox.QueryPartSelect) out.Set("selectField", selectField) /// /// not yet iimplement var aggrExp string if aggrParts, hasAggr := parts[dbox.QueryPartAggr]; hasAggr { incAtt := 0 for _, aggr := range aggrParts.([]*dbox.QueryPart) { /* isi qp : &{AGGR {$sum 1 Total Item}}*/ aggrInfo := aggr.Value.(dbox.AggrInfo) /* isi Aggr Info : {$sum 1 Total Item}*/ if incAtt == 0 { aggrExp = strings.Replace(aggrInfo.Op, "$", "", 1) + "(" + toolkit.ToString(aggrInfo.Field) + ")" + " as \"" + aggrInfo.Alias + "\"" } else { aggrExp += ", " + strings.Replace(aggrInfo.Op, "$", "", 1) + "(" + toolkit.ToString(aggrInfo.Field) + ")" + " as \"" + aggrInfo.Alias + "\"" } incAtt++ } } out.Set("aggr", aggrExp) /// /// Where Condition var where interface{} if whereParts, hasWhere := parts[dbox.QueryPartWhere]; hasWhere { fb := q.Connection().Fb() for _, p := range whereParts.([]*dbox.QueryPart) { for _, f := range p.Value.([]*dbox.Filter) { if in != nil { f = rdbms.ReadVariable(f, in) } fb.AddFilter(f) } } where, e = fb.Build() if e != nil { return nil, err.Error(packageName, modQuery, "prepare", e.Error()) } } out.Set("where", where) /// /// Sort Condition var sort []string if sortParts, hasSort := parts[dbox.QueryPartOrder]; hasSort { sort = []string{} for _, sr := range sortParts.([]*dbox.QueryPart) { for _, s := range sr.Value.([]string) { sort = append(sort, s) } } } out.Set("sort", sort) /// /// Take Condition take := 0 isTake := false if takeParts, hasTake := parts[dbox.QueryPartTake]; hasTake { isTake = true take = takeParts.([]*dbox.QueryPart)[0].Value.(int) } out.Set("isTake", isTake) out.Set("take", take) /// /// Skip Condition skip := 0 isSkip := false if skipParts, hasSkip := parts[dbox.QueryPartSkip]; hasSkip { isSkip = true skip = skipParts.([]*dbox.QueryPart)[0].Value.(int) } out.Set("isSkip", isSkip) out.Set("skip", skip) /// /// Group By Condition var groupExp string hasAggr := false if groupParts, hasGroup := parts[dbox.QueryPartGroup]; hasGroup { hasAggr = true for _, pg := range groupParts.([]*dbox.QueryPart) { for i, grValue := range pg.Value.([]string) { if i == 0 { groupExp += grValue } else { groupExp += ", " + grValue } } } } out.Set("group", groupExp) out.Set("hasAggr", hasAggr) /// /// Order By Condition var orderExp string if orderParts, hasOrder := parts[dbox.QueryPartOrder]; hasOrder { for _, ordrs := range orderParts.([]*dbox.QueryPart) { for i, oVal := range ordrs.Value.([]string) { if i == 0 { if string(oVal[0]) == "-" { orderExp = strings.Replace(oVal, "-", "", 1) + " DESC" } else { orderExp = oVal + " ASC" } } else { if string(oVal[0]) == "-" { orderExp += ", " + strings.Replace(oVal, "-", "", 1) + " DESC" } else { orderExp += ", " + oVal + " ASC" } } } } } out.Set("order", orderExp) } return }
func (c *Cursor) Fetch(m interface{}, n int, closeWhenDone bool) error { if closeWhenDone { defer c.Close() } e := c.prepIter() if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } // if !toolkit.IsPointer(m) { // return errorlib.Error(packageName, modCursor, "Fetch", "Model object should be pointer") // } if n != 1 && reflect.ValueOf(m).Elem().Kind() != reflect.Slice { return errorlib.Error(packageName, modCursor, "Fetch", "Model object should be pointer of slice") } // fmt.Println("LINE 112 : ", reflect.ValueOf(m).Elem().Kind()) // ds := dbox.NewDataSet(m) // rm := reflect.ValueOf(m) // // erm := rm.Elem() // fmt.Println(rm) // var v reflect.Type // if n == 1 { // v = reflect.TypeOf(m).Elem() // } else { // v = reflect.TypeOf(m).Elem().Elem() // } // iv := reflect.New(v).Elem() // vdatas := reflect.Indirect(m) // fmt.Println("LINE 131 : ", iv.Kind()) // vdatas := reflect.MakeSlice(reflect.SliceOf(v), 0, n) // fmt.Println("LINE 133 : ", reflect.TypeOf(vdatas)) datas := []toolkit.M{} lineCount := 0 //============================= for { isAppend := true c.count += 1 recData := toolkit.M{} appendData := toolkit.M{} dataTemp, e := c.reader.Read() for i, val := range dataTemp { recData[c.headerColumn[i].name] = val if len(c.ConditionVal.Select) == 0 || c.ConditionVal.Select.Get("*", 0).(int) == 1 { appendData[c.headerColumn[i].name] = val } else { if c.ConditionVal.Select.Get(c.headerColumn[i].name, 0).(int) == 1 { appendData[c.headerColumn[i].name] = val } } } isAppend = c.ConditionVal.getCondition(recData) if c.count < c.ConditionVal.skip || (c.count > (c.ConditionVal.skip+c.ConditionVal.limit) && c.ConditionVal.limit > 0) { isAppend = false } if e == io.EOF { if isAppend && len(appendData) > 0 { datas = append(datas, appendData) lineCount += 1 } break } else if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } if isAppend && len(appendData) > 0 { datas = append(datas, appendData) lineCount += 1 } if n > 0 { if lineCount >= n { break } } } // if iv.Kind() == reflect.Map { // } else if iv.Kind() == reflect.Struct { // } e = toolkit.Unjson(toolkit.Jsonify(datas), m) if e != nil { return errorlib.Error(packageName, modCursor, "Fetch", e.Error()) } // bs, _ := json.Marshal(datas) // _ = json.Unmarshal(bs, m) // reflect.ValueOf(m).Elem().Set(reflect.ValueOf(datas)) return nil }
func (q *Query) Exec(in toolkit.M) error { setting, e := q.prepare(in) commandType := setting["commandtype"].(string) //toolkit.Printf("Command type: %s\n", commandType) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, e.Error()) } if setting.GetString("commandtype") == dbox.QueryPartSelect { return err.Error(packageName, modQuery, "Exec: "+commandType, "Exec is not working with select command, please use .Cursor instead") } q.Lock() defer q.Unlock() var dataM toolkit.M var dataMs []toolkit.M hasData := in.Has("data") dataIsSlice := false data := in.Get("data") if toolkit.IsSlice(data) { dataIsSlice = true e = toolkit.Unjson(toolkit.Jsonify(data), dataMs) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data encoding error: "+e.Error()) } } else { dataM, e = toolkit.ToM(data) dataMs = append(dataMs, dataM) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data encoding error: "+e.Error()) } } hasWhere := setting.Has("where") where := setting.Get("where", []*dbox.Filter{}).([]*dbox.Filter) if hasWhere && len(where) == 0 { inWhere := in.Get("where") if inWhere == nil { hasWhere = false where = nil } else { if !toolkit.IsSlice(inWhere) { where = append(where, inWhere.(*dbox.Filter)) } else { where = inWhere.([]*dbox.Filter) } } } if hasData && hasWhere == false && toolkit.HasMember([]interface{}{dbox.QueryPartInsert, dbox.QueryPartDelete, dbox.QueryPartUpdate, dbox.QueryPartSave}, commandType) { hasWhere = true //toolkit.Println("check where") if toolkit.IsSlice(data) { ids := []interface{}{} idField := "" if idField == "" { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data send is a slice, but its element has no ID") } dataCount := toolkit.SliceLen(data) for i := 0; i < dataCount; i++ { dataI := toolkit.SliceItem(data, i) if i == 0 { idField = toolkit.IdField(dataI) } ids = append(ids, toolkit.Id(dataI)) } where = []*dbox.Filter{dbox.In(idField, ids)} } else { idfield := "_id" id := toolkit.Id(data) if !toolkit.IsNilOrEmpty(id) { where = []*dbox.Filter{dbox.Eq(idfield, id)} } else { where = nil hasWhere = false } } } /* toolkit.Printf("CommandType: %s HasData: %v HasWhere: %v Where: %s\n", commandType, hasData, hasWhere, toolkit.JsonString(where)) */ e = q.openFile(commandType) //toolkit.Printf(commandType+" Open File, found record: %d\nData:%s\n", len(q.data), toolkit.JsonString(q.data)) if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, e.Error()) } var indexes []interface{} if hasWhere && commandType != dbox.QueryPartInsert { whereIndex := dbox.Find(q.data, where) indexes = toolkit.ToInterfaceArray(&whereIndex) //toolkit.Printf("Where Index: %s Index:%s\n", toolkit.JsonString(whereIndex), toolkit.JsonString(indexes)) } if commandType == dbox.QueryPartInsert { if !hasData { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data is empty") } if !dataIsSlice { dataMs = []toolkit.M{dataM} } //-- validate for _, datam := range dataMs { idField, idValue := toolkit.IdInfo(datam) toolkit.Serde(dbox.Find(q.data, []*dbox.Filter{dbox.Eq(idField, idValue)}), &indexes, "") if len(indexes) > 0 { return err.Error(packageName, modQuery, "Exec: "+commandType, toolkit.Sprintf("Data %v already exist", idValue)) } } //-- insert the data q.data = append(q.data, dataMs...) 
} else if commandType == dbox.QueryPartUpdate { //-- valida if !hasData { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data is empty") } var dataUpdate toolkit.M var updateDataIndex int // if it is a slice then we need to update each data passed on its slice isDataSlice := toolkit.IsSlice(data) if isDataSlice == false { isDataSlice = false e = toolkit.Serde(data, &dataUpdate, "") if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Serde data fail"+e.Error()) } } var idField string //toolkit.Printf("Indexes: %s\n", toolkit.JsonString(indexes)) for i, v := range q.data { // update only data that match given inde if toolkit.HasMember(indexes, i) || !hasWhere { if idField == "" { idField = toolkit.IdField(v) if idField == "" { return err.Error(packageName, modQuery, "Exec: "+commandType, "No ID") } } // If dataslice is sent, iterate f if isDataSlice { e = toolkit.Serde(toolkit.SliceItem(data, updateDataIndex), &dataUpdate, "") if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType, "Serde data fail "+e.Error()) } updateDataIndex++ } dataOrigin := q.data[i] toolkit.CopyM(&dataUpdate, &dataOrigin, false, []string{"_id"}) toolkit.Serde(dataOrigin, &v, "") q.data[i] = v } } } else if commandType == dbox.QueryPartDelete { if hasWhere && len(where) > 0 { indexes := dbox.Find(q.data, where) if len(indexes) > 0 { newdata := []toolkit.M{} for index, v := range q.data { partOfIndex := toolkit.HasMember(indexes, index) if partOfIndex == false { newdata = append(newdata, v) } //toolkit.Println("i:", indexes, ", index:", index, ", p.ofIndex: ", partOfIndex, ", data: ", toolkit.JsonString(newdata)) } q.data = newdata } } else { q.data = []toolkit.M{} } //toolkit.Printf("Data now: %s\n", toolkit.JsonString(q.data)) } else if commandType == dbox.QueryPartSave { if !hasData { return err.Error(packageName, modQuery, "Exec: "+commandType, "Data is empty") } var dataMs []toolkit.M var dataM toolkit.M if !toolkit.IsSlice(data) { e = toolkit.Serde(&data, &dataM, "json") if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType+" Serde data fail", e.Error()) } dataMs = append(dataMs, dataM) } else { e = toolkit.Serde(&data, &dataMs, "json") if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType+" Serde data fail", e.Error()) } } //toolkit.Printf("Saving: %s\n", toolkit.JsonString(dataMs)) for _, v := range dataMs { idField, idValue := toolkit.IdInfo(v) indexes := dbox.Find(q.data, []*dbox.Filter{dbox.Eq(idField, idValue)}) if len(indexes) == 0 { q.data = append(q.data, v) } else { dataOrigin := q.data[indexes[0]] //toolkit.Printf("Copy data %s to %s\n", toolkit.JsonString(v), toolkit.JsonString(dataOrigin)) toolkit.CopyM(&v, &dataOrigin, false, []string{idField}) q.data[indexes[0]] = dataOrigin } } } e = q.writeFile() if e != nil { return err.Error(packageName, modQuery, "Exec: "+commandType+" Write fail", e.Error()) } return nil }
func (q *Query) Exec(parm toolkit.M) error { var ( e error updatedValue, dataMs []toolkit.M dataM toolkit.M ) filters, e := q.Filters(parm) if e != nil { return errorlib.Error(packageName, modQuery, "Exec", e.Error()) } if parm == nil { parm = toolkit.M{} } data := parm.Get("data", nil) filePath := q.Connection().(*Connection).filePath commandType := filters.Get("cmdType").(string) hasWhere := filters.Has("where") hasCmdType := toolkit.M{} hasData := parm.Has("data") getWhere := filters.Get("where", []*dbox.Filter{}).([]*dbox.Filter) dataIsSlice := toolkit.IsSlice(data) if dataIsSlice { e = toolkit.Unjson(toolkit.Jsonify(data), &dataMs) if e != nil { return errorlib.Error(packageName, modQuery, "Exec: "+commandType, "Data encoding error: "+e.Error()) } for _, v := range dataMs { id := toolkit.Id(v) idF := toolkit.IdField(v) if toolkit.IsNilOrEmpty(id) { return errorlib.Error(packageName, modCursor+".Exec", commandType, "Unable to find ID in slice data") } else { getWhere = []*dbox.Filter{dbox.Eq(idF, id)} } } } else { dataM, e = toolkit.ToM(data) if e != nil { return errorlib.Error(packageName, modQuery, "Exec: "+commandType, "Unable to Map, error: "+e.Error()) } id := toolkit.Id(dataM) if !toolkit.IsNilOrEmpty(id) { getWhere = []*dbox.Filter{dbox.Eq(toolkit.IdField(dataM), id)} } } var dataMaps []toolkit.M q.ReadFile(&dataMaps, filePath) if commandType == dbox.QueryPartInsert { hasCmdType.Set("hasInsert", true) if !hasData { return errorlib.Error(packageName, modCursor+".Exec", commandType, "Sorry data not found!, unable to insert data") } result := dbox.Find(dataMaps, getWhere) if len(result) > 0 { return errorlib.Error(packageName, modCursor+".Exec", commandType, "ID already exist, unable insert data ") } if dataIsSlice { var sliceData []toolkit.M for _, v := range dataMs { sliceData = finUpdateObj(dataMaps, v, "insert") } updatedValue = sliceData } else { updatedValue = finUpdateObj(dataMaps, dataM, "insert") } } else if commandType == dbox.QueryPartUpdate { hasCmdType.Set("hasUpdate", true) if !hasData { return errorlib.Error(packageName, modCursor+".Exec", commandType, "Sorry data not found!, unable to update data") } if hasWhere { var indexes []interface{} whereIndex := dbox.Find(dataMaps, getWhere) indexes = toolkit.ToInterfaceArray(&whereIndex) // toolkit.Printf("whereIndex>%v indexes%v\n", whereIndex, indexes) var dataUpdate toolkit.M var updateDataIndex int isDataSlice := toolkit.IsSlice(data) if isDataSlice == false { isDataSlice = false data, e = toolkit.ToM(data) if e != nil { return errorlib.Error(packageName, modQuery, "Exec: "+commandType, "Serde data fail"+e.Error()) } e = toolkit.Serde(data, &dataUpdate, "") if e != nil { return errorlib.Error(packageName, modQuery, "Exec: "+commandType, "Serde data fail"+e.Error()) } } for i, v := range dataMaps { if toolkit.HasMember(indexes, i) || !hasWhere { if isDataSlice { e = toolkit.Serde(toolkit.SliceItem(data, updateDataIndex), &dataUpdate, "") if e != nil { return errorlib.Error(packageName, modQuery, "Exec: "+commandType, "Serde data fail"+e.Error()) } updateDataIndex++ } dataOrigin := dataMaps[i] toolkit.CopyM(&dataUpdate, &dataOrigin, false, []string{"_id"}) toolkit.Serde(dataOrigin, &v, "") dataMaps[i] = v } } updatedValue = dataMaps } else { updatedValue = finUpdateObj(dataMaps, dataM, "update") } } else if commandType == dbox.QueryPartDelete { hasCmdType.Set("hasDelete", true) // if multi { if hasWhere { result := dbox.Find(dataMaps, getWhere) if len(result) > 0 { for i, v := range dataMaps { if toolkit.HasMember(result, 
i) == false { updatedValue = append(updatedValue, v) } } } } else { updatedValue = []toolkit.M{} } } else if commandType == dbox.QueryPartSave { hasCmdType.Set("hasSave", true) if !hasData { return errorlib.Error(packageName, modCursor+".Exec", commandType, "Sorry data not found!, unable to update data") } q.dataType = "save" q.whereData = append(q.whereData, getWhere...) q.sliceData = append(q.sliceData, dataM) } if hasCmdType.Has("hasInsert") || hasCmdType.Has("hasUpdate") || hasCmdType.Has("hasDelete") { e = q.WriteFile(updatedValue) if e != nil { return errorlib.Error(packageName, modQuery+".Exec", commandType, e.Error()) } } return nil }