func (p *printer) file(src *ast.File) {
	p.setComment(src.Doc)
	p.print(src.Pos(), token.PACKAGE, blank)
	p.expr(src.Name)
	p.declList(src.Decls)
	p.print(newline)
}
func convertAST(info *types.Info, src []byte, sgoAST *ast.File, fset *token.FileSet) []byte {
	c := converter{
		Info: info,
		src:  src,
		base: fset.File(sgoAST.Pos()).Base() - 1,
		fset: fset,
	}
	c.convertFile(sgoAST)
	return bytes.Join(append(c.dstChunks, src[c.lastChunkEnd:]), nil)
}
func fileWithAnnotationComments(file *ast.File, fset, oldFset *token.FileSet, src []byte) ([]byte, *ast.File, error) {
	// TODO: So this is an extremely hacky way of doing this. We're going to
	// add the comments directly to the source, as text, and then we're going
	// to re-parse it. This is because I tried manipulating the AST, adding
	// the comments there and shifting the nodes' positions, but doing that
	// right is very convoluted; you need to be tracking all the time where
	// you are, where you _were_, figure out where's a line break, etc. So,
	// well, this will do for now.
	var err error
	var dstChunks [][]byte
	var lastChunkEnd int
	skipNextSpec := false
	addDoc := func(node ast.Node, name *ast.Ident, typ ast.Expr) {
		if typ == nil {
			return
		}
		if name != nil && len(name.Name) > 0 {
			c := name.Name[0]
			if !(c >= 'A' && c <= 'Z') {
				// Only annotate exported names.
				return
			}
		}
		buf := &bytes.Buffer{}
		err = printer.Fprint(buf, token.NewFileSet(), typ)
		if err != nil {
			return
		}
		pos := int(node.Pos()) - oldFset.File(file.Pos()).Base()
		// Capture the indentation before the node so the annotation comment
		// lines up with the declaration it documents.
		var space []byte
		for i := pos - 1; i >= 0 && (src[i] == ' ' || src[i] == '\t'); i-- {
			space = append([]byte{src[i]}, space...)
		}
		text := append([]byte("// For SGo: "+buf.String()+"\n"), space...)
		dstChunks = append(dstChunks, src[lastChunkEnd:pos], text)
		lastChunkEnd = pos
	}
	var visitor visitorFunc
	visitor = visitorFunc(func(node ast.Node) (w ast.Visitor) {
		var typ ast.Expr
		var name *ast.Ident
		switch node := node.(type) {
		case *ast.FuncDecl:
			typ = node.Type
			name = node.Name
		case *ast.GenDecl:
			if node.Lparen != 0 || node.Tok == token.IMPORT || node.Tok == token.CONST {
				return visitor
			}
			switch spec := node.Specs[0].(type) {
			case *ast.TypeSpec:
				skipNextSpec = true
				typ = spec.Type
				name = spec.Name
			case *ast.ValueSpec:
				skipNextSpec = true
				typ = spec.Type
				if len(spec.Names.List) > 0 {
					name = spec.Names.List[0]
				}
			}
			switch typ.(type) {
			case *ast.InterfaceType, *ast.StructType:
				return visitor
			}
		case *ast.InterfaceType:
			for i := 0; i < len(node.Methods.List); i++ {
				item := node.Methods.List[i]
				if len(item.Names) > 0 {
					name = item.Names[0]
				}
				addDoc(item, name, item.Type)
			}
			return visitor
		case *ast.StructType:
			for i := 0; i < len(node.Fields.List); i++ {
				item := node.Fields.List[i]
				if len(item.Names) > 0 {
					name = item.Names[0]
				}
				addDoc(item, name, item.Type)
			}
			return visitor
		case *ast.TypeSpec:
			if skipNextSpec {
				skipNextSpec = false
				return visitor
			}
			typ = node.Type
			name = node.Name
		case *ast.ValueSpec:
			if skipNextSpec {
				skipNextSpec = false
				return visitor
			}
			typ = node.Type
			if len(node.Names.List) > 0 {
				name = node.Names.List[0]
			}
		default:
			return visitor
		}
		addDoc(node, name, typ)
		return visitor
	})
	ast.Walk(visitor, file)
	if err != nil {
		return nil, nil, err
	}
	dst := append(
		[]byte("// Autogenerated by SGo revision: "+SGoRevision+"\n// DO NOT EDIT!\n\n"),
		bytes.Join(append(dstChunks, src[lastChunkEnd:]), nil)...)
	dstFile, err := parser.ParseFile(fset, file.Name.Name, dst, parser.ParseComments)
	return dst, dstFile, err
}
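The TODO above describes the core trick: instead of shifting AST positions by hand, the code splices annotation comments into the raw source bytes and re-parses the result. Below is a minimal, self-contained sketch of that splice-and-reparse pattern using only the standard library; the file name, the sample declaration, and the annotation text are illustrative, not taken from the SGo sources.

package main

import (
	"bytes"
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

func main() {
	src := []byte("package p\n\nfunc F() int { return 0 }\n")

	// Offset of the declaration we want to annotate; the real code derives
	// this from node positions and the file's base offset.
	pos := bytes.Index(src, []byte("func F"))

	// Splice the annotation comment into the raw source, keeping the
	// untouched chunks of the original bytes as-is.
	var dstChunks [][]byte
	lastChunkEnd := 0
	dstChunks = append(dstChunks, src[lastChunkEnd:pos], []byte("// For SGo: func () int\n"))
	lastChunkEnd = pos
	dst := bytes.Join(append(dstChunks, src[lastChunkEnd:]), nil)

	// Re-parsing the spliced source yields a fresh AST with consistent
	// positions and the injected comment attached as the declaration's doc.
	fset := token.NewFileSet()
	f, err := parser.ParseFile(fset, "p.go", dst, parser.ParseComments)
	if err != nil {
		panic(err)
	}
	fmt.Print(f.Decls[0].(*ast.FuncDecl).Doc.Text()) // prints "For SGo: func () int"
}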